A freelancer with extensive experience in web technologies, accustomed to developing back-end software of high complexity, robustness, and reliability, and faithful to the C family of programming languages: C, C++, and C#, along with data definition and manipulation languages such as T-SQL, with varied experience in Microsoft technologies and products. This briefly describes the experience I have gained over the years, starting from IT enthusiasm and growing into IT professionalism.
Deep knowledge of the following technologies and methodologies (to the right) represents my expertise. I have divided them into three main areas: Web, Backend, and General. Needless to say, such a list is never frozen; it is only a snapshot of my current IT armoury, and a condensed one at that.
A Gillette promotional web site for a number of countries in Latin and South America.
Implemented using ASP.NET MVC3,
Integrated with a couple of social networks, especially Facebook,
Entity Framework code-first back-end.
Web site for “Facts Up Front”, a joint initiative of the Grocery Manufacturers Association and the Food Marketing Institute. A feature-rich back office allows a number of settings, as well as some content, to be changed, thus serving as a simple CMS (Content Management System). Please visit www.factsupfront.org to get familiar with the site's details.
Implemented using ASP.NET MVC3,
“SmoothScroll” jQuery plugin (see Recipes section),
Entity Framework back-end.
Web-based video collaboration application for British Telecom.
Browser-hosted live video streaming,
Complicated session-based state machine implemented,
Flexible Entity Framework-powered back-end supporting hundreds of hardware devices.
Restrictions apply; no more details are available.
The status of a shipment is usually checked by the person who ordered the delivery. When dealing with a single carrier, the sender deals with a single site and becomes familiar with the way that site presents shipment data. If, however, multiple carriers are involved in the delivery process, it becomes hard to find out where the shipment is and what its status is.
This project solves the problem by collecting data about shipments from various carriers and unifying it. The data is collected by parsing the HTML output by the carriers' servers in response to users' requests. It is kept in an MS SQL Server 2005 database and retrieved by web services that access the data through multiple tiers.
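The extraction step can be sketched roughly as follows. The markup pattern below is invented for illustration; each real carrier's pages need their own parsing rules:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical example: extracting a shipment status from a carrier's
// HTML tracking page. The <td class="status"> shape is made up for this
// sketch; real grabbers match each carrier's actual markup.
public static class StatusParser
{
    static readonly Regex StatusPattern = new Regex(
        @"<td\s+class=""status"">\s*(?<status>[^<]+?)\s*</td>",
        RegexOptions.IgnoreCase);

    public static string ExtractStatus(string html)
    {
        Match m = StatusPattern.Match(html);
        return m.Success ? m.Groups["status"].Value : null;
    }
}
```

A call such as `StatusParser.ExtractStatus("<td class=\"status\">In transit</td>")` yields `"In transit"`; when no match is found, `null` signals that the page layout has changed and the rule needs updating.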
Data Access Tier
The business entities and data access logic reside in the Data Access Layer. It was generated by “SimpleGen”, an O/R mapper I developed earlier.
Plenty of unit tests were written to provide both confidence while implementing all subsequent layers and solid support for changing the code if and when necessary.
Business Logic Tier
Business classes were built on top of the business entities defined in the Data Access Layer. The layer provides the business logic implementation as well as transaction processing for the underlying SQL Server.
The layer contains all classes responsible for “grabbing” URLs and parsing them. The Provider design pattern was used to define concrete grabbers; a concrete grabber is configured by two configuration classes. Each carrier has its own dedicated grabber. Grabbers are implemented as plug-ins, thus achieving a high degree of code and functionality independence.
A specialized threaded scheduler also lives in this layer. It manages multiple threads that simultaneously execute delivery-status checks for multiple shipments.
Shipment data is accessed and manipulated through a web service, which also implements some functionality for controlling the threaded scheduler.
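The grabber arrangement can be sketched as below. Interface and class names are illustrative, not the project's actual API, and plug-in discovery (reflection over a plug-in directory, plus the two configuration classes) is reduced to direct registration to keep the sketch self-contained:

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch of the provider pattern used for carrier "grabbers".
public interface IShipmentGrabber
{
    string CarrierName { get; }
    string GetStatus(string trackingNumber);
}

// Each carrier gets a dedicated grabber, loaded as a plug-in in the real system.
public class DemoCarrierGrabber : IShipmentGrabber
{
    public string CarrierName { get { return "DemoCarrier"; } }

    public string GetStatus(string trackingNumber)
    {
        // A real grabber fetches the carrier's tracking page and parses its
        // HTML; a canned value keeps this sketch runnable without the network.
        return "In transit";
    }
}

public class GrabberRegistry
{
    readonly Dictionary<string, IShipmentGrabber> _grabbers =
        new Dictionary<string, IShipmentGrabber>(StringComparer.OrdinalIgnoreCase);

    public void Register(IShipmentGrabber grabber)
    {
        _grabbers[grabber.CarrierName] = grabber;
    }

    public IShipmentGrabber Resolve(string carrier)
    {
        return _grabbers[carrier];
    }
}
```

Because each grabber lives behind the same interface, adding a new carrier means dropping in a new plug-in assembly; no existing code changes.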
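A minimal sketch of the threading idea follows. The real scheduler is more elaborate (queuing, throttling, control via the web service); the names here are invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch: run a status check for each shipment on its own worker thread
// and collect the results under a lock.
public class StatusCheckScheduler
{
    public IDictionary<string, string> CheckAll(
        IEnumerable<string> trackingNumbers,
        Func<string, string> checkStatus)
    {
        var results = new Dictionary<string, string>();
        var threads = new List<Thread>();
        object sync = new object();

        foreach (string number in trackingNumbers)
        {
            string captured = number;          // capture per-iteration value
            var t = new Thread(() =>
            {
                string status = checkStatus(captured);
                lock (sync) { results[captured] = status; }
            });
            threads.Add(t);
            t.Start();
        }

        foreach (var t in threads) t.Join();   // wait for all checks to finish
        return results;
    }
}
```

The `checkStatus` delegate is where a concrete grabber would be invoked for the shipment's carrier.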
The ASP.NET-based web site of AloeCo, a well-known Bulgarian software company and a leader in the CRM and ERP software market. Since the company operates only on the Bulgarian market, the site is in Bulgarian only.
The site's news items were hard-coded, which caused problems when adding new items or managing existing ones. I developed a three-tier (DAL, BLL, front-end) solution.
News items used images that were scattered across the web site's directory structure. Now all images are grouped into folders, and both the images and the containing folders are stored in MS SQL Server. A specialized HTTP handler (.ashx) was created to serve image requests. The handler resizes images to the two standard formats used on the site: constrained images and the thumbnails used in the image catalogue viewer.
ImageLib was also implemented in a multitier manner.
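The resizing itself boils down to a small piece of pure arithmetic; the handler additionally reads the image bytes from SQL Server and re-encodes them, which is omitted here. Names and the two format limits are illustrative:

```csharp
using System;

// Sketch: compute target dimensions for a constrained format, preserving
// aspect ratio and never enlarging the original. The .ashx handler applies
// this before re-encoding the image fetched from the database.
public static class ImageSizing
{
    public static Tuple<int, int> Constrain(int w, int h, int maxW, int maxH)
    {
        double scale = Math.Min((double)maxW / w, (double)maxH / h);
        if (scale >= 1.0) return Tuple.Create(w, h);   // already fits
        return Tuple.Create((int)Math.Round(w * scale),
                            (int)Math.Round(h * scale));
    }
}
```

For example, an 800×600 original constrained to 400×400 becomes 400×300, while a 100×50 image passes through unchanged.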
URL Rewriting and SEO Optimization
Developed a (simple) URL-rewriting framework for the site, handling both a regular-expression mapper and an SQL-lookup rewriter. Internally the friendly name is resolved to an identifier (a Guid) and the request is submitted for further processing through the normal ASP.NET pipeline.
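The two-stage idea can be sketched as follows. The URL shapes and class names are invented for illustration, and the SQL lookup is stood in for by a dictionary:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Sketch of a two-stage rewriter: a regex rule extracts the friendly name,
// then a lookup (SQL-backed in the real site) maps it to a Guid for the
// internal URL handed to the normal ASP.NET pipeline.
public class UrlRewriter
{
    static readonly Regex NewsUrl = new Regex(@"^/news/(?<name>[a-z0-9-]+)$");

    readonly IDictionary<string, Guid> _lookup;   // stands in for the SQL lookup

    public UrlRewriter(IDictionary<string, Guid> lookup) { _lookup = lookup; }

    public string Rewrite(string path)
    {
        Match m = NewsUrl.Match(path);
        if (!m.Success) return path;              // no rule matched: pass through

        Guid id;
        if (!_lookup.TryGetValue(m.Groups["name"].Value, out id))
            return path;                          // unknown name: pass through

        return "/News.aspx?id=" + id;             // internal, non-friendly URL
    }
}
```

Unmatched paths fall through untouched, so the rewriter can sit in front of the existing pipeline without affecting other requests.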
The software acquires images from a video input stream generated by cameras (e.g. webcams). The image acquisition process consists of two parts:
Capturing video from the camera (I used a Toshiba Camileo camcorder for this purpose), and
Manipulating the resulting video in AVI format.
A single frame of the stream, selected by the operator, is taken as a picture. Both parts of the project were implemented through calls to the API exposed by avicap32.dll, which is part of the Microsoft Windows operating system.
A good explanation of how to use this API can be found here: http://www.devx.com/dotnet/Article/30375. The code there is in VB.NET but may help in understanding the process.
(NB: I cannot publish my code, which is not a translation of the article's code to C#.)
As additional functionality, Windows Image Acquisition (WIA) was used to capture high-resolution images.
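For reference, the interop surface involved looks roughly like this. These entry points and message constants come from the documented Video for Windows API (Windows only); window setup, the message loop, and error handling are omitted:

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch of the avicap32.dll interop used for frame capture.
// A capture window is created with capCreateCaptureWindowA, then driven
// by sending WM_CAP_* messages to it.
public static class VideoCapture
{
    const int WM_USER = 0x0400;
    public const int WM_CAP_DRIVER_CONNECT = WM_USER + 10;  // attach a capture driver
    public const int WM_CAP_FILE_SAVEDIB   = WM_USER + 25;  // save current frame as a DIB/BMP
    public const int WM_CAP_GRAB_FRAME     = WM_USER + 60;  // grab a single frame

    [DllImport("avicap32.dll")]
    public static extern IntPtr capCreateCaptureWindowA(
        string lpszWindowName, int dwStyle,
        int x, int y, int nWidth, int nHeight,
        IntPtr hWndParent, int nID);

    [DllImport("user32.dll")]
    public static extern IntPtr SendMessage(
        IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);
}
```

Taking a still of the operator-selected frame amounts to sending `WM_CAP_GRAB_FRAME` followed by `WM_CAP_FILE_SAVEDIB` to the capture window.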