Database Replication Software

In 2008, EmpireCLS began migrating their legacy Linux data to Microsoft SQL Server. A custom real-time data replication service was created to facilitate this first step in porting the legacy Linux C code to C# / .NET on Windows.

A code generator was created that parsed C header files and produced an XML model representing all of the data structures used by the existing ISAM database. This XML model was then fed to CodeSmith templates that generated the SQL Server tables, stored procedures, and triggers handling both storage and access. We also used CodeSmith to generate a complete C# data access layer that the various services would use to interact with the database; this layer served as the marshaling layer between the ISAM and SQL Server databases.
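
As a rough illustration of that first step (the struct, regular expressions, and class names below are hypothetical, not the actual generator), a C header struct could be reduced to an XML model along these lines:

    using System;
    using System.Text.RegularExpressions;
    using System.Xml.Linq;

    // Hypothetical sketch: reduce a C struct definition to an XML model that
    // code-generation templates (such as CodeSmith) could then consume.
    class HeaderModeler
    {
        static XElement ModelStruct(string cHeaderSource)
        {
            // Match "struct name { ... };" and capture the name and body.
            Match structMatch = Regex.Match(cHeaderSource,
                @"struct\s+(\w+)\s*\{(.*?)\};", RegexOptions.Singleline);

            var table = new XElement("Table",
                new XAttribute("name", structMatch.Groups[1].Value));

            // Match simple "type name;" or "type name[len];" field declarations.
            foreach (Match field in Regex.Matches(structMatch.Groups[2].Value,
                @"(\w+)\s+(\w+)\s*(\[\d+\])?\s*;"))
            {
                table.Add(new XElement("Column",
                    new XAttribute("name", field.Groups[2].Value),
                    new XAttribute("cType", field.Groups[1].Value)));
            }
            return table;
        }

        static void Main()
        {
            const string header = "struct customer { long id; char name[64]; };";
            Console.WriteLine(ModelStruct(header));
        }
    }

Each Table / Column element in the resulting model could then drive one template per artifact: the table DDL, the stored procedures and triggers, and the matching C# data access class.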

Technologies: C#, XML, CodeSmith.

Information Management Software

In 2004 Deloitte faced many issues with their existing fat-client audit software and decided it was time to implement a more enterprise-wide solution to meet the geographically distributed needs of their business. The firm was not able to get information and guidelines to their auditors in a timely manner because distribution was a manual process. An enterprise system was designed to allow for real-time data exchange to and from the 40K+ auditors throughout the world. The new system would synchronize information between individual auditors and the global servers, as opposed to using CDs for information transfer. The system also allowed managers to view aggregate audit data from several auditors at once via dashboards.

The new system alleviated ongoing application maintenance issues that the firm was facing. Rather than having to distribute thousands of CDs, the auditors in the field were able to connect to the central servers and download the latest application software.

Internationalization requirements were easily met with the new system since all information that required localization was stored in centralized repositories rather than hard-coded within the client application. The information could be exported, translated by 3rd-party consultants and the like, and then re-imported into the system for synchronization with the auditor client applications.
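
As a simplified sketch of the idea (the Localization table and its columns are assumptions, not the actual schema), the client looked display text up from the centralized repository rather than compiling it in:

    using System.Data.SqlClient;

    static class Localizer
    {
        // Hypothetical sketch: look up display text from a centralized
        // localization table instead of hard-coding it in the client.
        public static string GetLocalizedText(string connectionString,
                                              string resourceId, string culture)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT Text FROM Localization " +
                "WHERE ResourceId = @id AND Culture = @culture", conn))
            {
                cmd.Parameters.AddWithValue("@id", resourceId);
                cmd.Parameters.AddWithValue("@culture", culture);
                conn.Open();
                // Fall back to the resource ID if no translation has been imported yet.
                return (cmd.ExecuteScalar() as string) ?? resourceId;
            }
        }
    }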

This system required a distributed deployment environment across the globe. Each country, or region within a country, had its own centralized deployment. Each deployment was required to be available 24/7, so Microsoft clustering technologies were used to provide active / active node clusters for the SQL Server deployments. Clients connected to their regional application servers via a VPN connection, which then directed them, via an NLB (network load balancer), to the application server farm.

We were also tasked with helping bridge the gap between multiple teams, both on premises and in Europe. This was one of the more exciting aspects of the project, as we were able to build a true camaraderie with the European team where there had been some issues previously.

The result was a tool that allowed subject matter experts to author content that streamlined and simplified the audit process, increased data-capture validity, and decreased overall time spent on client engagements.

The application was built using WinForms and used a local SQL Server database. Background synchronization was used to bring the latest content down from a centralized SQL Server to the local client. Versioning needed to be taken into consideration so as not to break existing local data relationships with new data from the server. The application was written in C# and used XML and ADO.NET for data access.
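
The general shape of that background pull, as a minimal sketch (the table and column names are assumptions, not the real schema):

    using System.Data.SqlClient;

    static class ContentSync
    {
        // Hypothetical sketch: pull content rows the client has not seen yet,
        // skipping rows whose schema version is newer than the client supports
        // so that existing local data relationships are not broken.
        public static void PullLatestContent(string serverConn, string localConn,
                                             long lastSyncVersion, int maxSchemaVersion)
        {
            using (var server = new SqlConnection(serverConn))
            using (var local = new SqlConnection(localConn))
            {
                server.Open();
                local.Open();

                var select = new SqlCommand(
                    "SELECT ContentId, SchemaVersion, RowVersion, Payload " +
                    "FROM Content WHERE RowVersion > @since", server);
                select.Parameters.AddWithValue("@since", lastSyncVersion);

                using (SqlDataReader reader = select.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Versioning guard: ignore content the client cannot represent yet.
                        if (reader.GetInt32(1) > maxSchemaVersion) continue;

                        var upsert = new SqlCommand(
                            "UPDATE LocalContent SET Payload = @p, RowVersion = @v WHERE ContentId = @id; " +
                            "IF @@ROWCOUNT = 0 " +
                            "INSERT INTO LocalContent (ContentId, Payload, RowVersion) VALUES (@id, @p, @v);",
                            local);
                        upsert.Parameters.AddWithValue("@id", reader.GetInt64(0));
                        upsert.Parameters.AddWithValue("@p", reader.GetString(3));
                        upsert.Parameters.AddWithValue("@v", reader.GetInt64(2));
                        upsert.ExecuteNonQuery();
                    }
                }
            }
        }
    }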

Each regional / country deployment supported daily synchronization of 100K+ rows of data to 1K – 10K client users, as well as synchronization of 1000K+ rows per week up to the central firm repositories.

Auditing Software

In 2003 Deloitte asked us to help revamp the client auditing software used by thousands of auditors in the field to facilitate auditing processes at client facilities. The application used a mix of document gathering, canned forms, wizards, and contextual guidance to help auditors gather the most relevant information.

We were tasked with revamping the existing application, bringing it up to date, and making it more of an online / offline experience so that the auditors could receive the latest procedures and guidelines faster and so that data could be consolidated across multiple team members.

We helped guide the development teams technically and built out some frameworks used in building the client application. We were then tasked with splitting off into a separate group to build the information management system.

The client application used the .NET Framework (C#), a local SQL Server database, and .NET Office integration.

Order Fulfillment Software

In 2000 Barnes & Noble asked us to help design and implement the systems used throughout their fulfillment center. The upcoming holiday season was a concern, and the online bookseller faced the possibility of not being able to keep up with the shipping demands of its internet customers. Their legacy system was on its last legs and couldn't keep up with the growing demands of the business. A scalable enterprise solution was implemented to meet those growing needs, allowing the company to grow its online sales by a factor of x over the next several years. This project enabled the business to expand nationally and integrate with partners for parallel geographic order fulfillment. It also paved the way for a more automated warehouse floor, allowing the company to move from 4 shipping lanes to as many as the warehouse supported.

This project also removed the need for batched order fulfillment and gave way to real-time fulfillment. As inventory came in the back door and was inducted into the warehouse management system, it was matched against outstanding orders and routed to the order assembly area for packing and shipping. A customer could order a book online that was not in inventory at the fulfillment center and still have it shipped out the same day.
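
Conceptually, the induction-time match worked like the sketch below (written in C# for readability; the production code was VB6 / COM+, and the names here are illustrative):

    using System;
    using System.Collections.Generic;

    // Illustrative sketch: as an item is inducted, match it against outstanding
    // orders so it can be routed straight to order assembly instead of waiting
    // for a batch run.
    class InductionRouter
    {
        // Outstanding demand: SKU -> queue of order IDs still waiting on that SKU.
        private readonly Dictionary<string, Queue<int>> _openDemand =
            new Dictionary<string, Queue<int>>();

        public void AddDemand(string sku, int orderId)
        {
            if (!_openDemand.TryGetValue(sku, out Queue<int> queue))
                _openDemand[sku] = queue = new Queue<int>();
            queue.Enqueue(orderId);
        }

        // Returns the destination for one inducted unit of the given SKU.
        public string RouteInductedItem(string sku)
        {
            if (_openDemand.TryGetValue(sku, out Queue<int> queue) && queue.Count > 0)
            {
                int orderId = queue.Dequeue();
                return $"ORDER-ASSEMBLY (order {orderId})";
            }
            return "PUT-AWAY (stock)";
        }
    }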

By automating most of the existing manual processes, barnesandnoble.com saw a huge efficiency increase within the warehouse, resulting in customers getting their books faster. I believe a couple of the Harry Potter books were first released during this time, and I personally helped automate the batch picking and packaging of nearly 100,000 customer orders.

Each module was created as a VB6 ActiveX document hosted within IE 4. We chose this model because of the ease of deployment over the intranet and because we needed native code in the browser for integrating with scanners, printers, etc. The ActiveX documents would interface with COM+ components via DCOM, which in turn would handle all the data access.

The system required 24/7 availability in order to support the 24/7 warehouse operations. To support this, several pairs of application servers were clustered together and accessed via virtual IP addresses from client workstations using the DCOM protocol. A clustered SQL Server handled all OLTP database transactions. SQL Server replication was used to replicate the OLTP data to a second clustered SQL Server, which was used for OLAP mining and reporting.

Upon completion of this project we were seeing the following transactional statistics:

  • 250K+ order matching transactions per day spread across 3 shifts of ~20 client workstations.
  • 150K+ order packing transactions per day spread across 3 shifts of ~10 client workstations.
  • 150K+ order shipping transactions per day spread across 3 shifts of ~5 client workstations.
  • 150K+ shipping confirmation e-mails per day.
  • 1000K+ OLTP to OLAP synchronization transactions per day.

Staff Notification Software

In 2000 Barnes & Noble asked us to help design and implement a back-office email system that would be used to notify internal staff of various events occurring within the ordering process. I developed the service using a component architecture with the intent that it be reused by multiple developers.

By giving staff real-time notifications, they were able to react to ordering events more quickly, increasing customer satisfaction.

This system was developed as a clustered service on Windows NT 5.0 using SQL Server 6.5 as the back-end. The service received messages off an MSMQ message queue, formatted the emails, and then sent them out via SMTP through Microsoft Exchange.

The service itself was built in such a way as to allow other developers to easily create their own services based upon a set of base classes and interfaces. A developer needed only to develop a COM component and pass its identifier to the service on the command line. The queuing logic and transactions were all orchestrated from within the service itself, so developers didn't need to understand the nuances of MSMQ, transaction boundaries, etc.
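
Sketched in C# for illustration (the original plug-ins were COM components, so the interface and class names below are assumptions), the pattern looked roughly like this: the host service owned the queue handling, and a plug-in only formatted messages.

    using System;
    using System.Messaging;
    using System.Net.Mail;

    // Illustrative sketch of the plug-in pattern.
    public interface INotificationFormatter
    {
        MailMessage Format(string queuedMessageBody);
    }

    public class NotificationService
    {
        private readonly MessageQueue _queue;
        private readonly INotificationFormatter _formatter;
        private readonly SmtpClient _smtp;

        public NotificationService(string queuePath, INotificationFormatter formatter, string smtpHost)
        {
            // The host service owns all queue handling; the plug-in only formats.
            _queue = new MessageQueue(queuePath)
            {
                Formatter = new XmlMessageFormatter(new[] { typeof(string) })
            };
            _formatter = formatter;
            _smtp = new SmtpClient(smtpHost);
        }

        public void ProcessOne()
        {
            using (Message msg = _queue.Receive(TimeSpan.FromSeconds(30)))
            {
                MailMessage email = _formatter.Format((string)msg.Body);
                _smtp.Send(email);
            }
        }
    }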

Incident Management Software

In 1998 Barnes & Noble wanted an internal system for tracking and reporting issues for their internal IT systems.

This was a web-based reporting tool used internally to capture bugs and comments for the projects we were releasing.

Users were able to easily capture issues from their desktops using the web browser as opposed to documenting on paper or email. This allowed developers to use the captured data to triage issues.

The site was implemented in classic ASP using SQL Server 6.5 as the back-end. This was my introduction to web programming with a scripting language. I really enjoyed it and was extremely excited when I saw that just pushing out a new .asp page and hitting F5 in the browser was all it took to see the changes.

Real-Time Demand Analysis Software

In 1998 Barnes & Noble asked us to help implement a real-time demand analysis tool to help with warehouse inventory issues they were facing.

This system was a real-time process that aggregated warehouse inventory allocations and presented various views of the data to users via online reports. The reports were able to show both real-time and historical demand.

B&N was able to react to spikes in inventory usage in real-time and mitigate “out of stock” scenarios.

The client had an existing SQL Server 6.5 database in place. We implemented a series of stored procedures and triggers that ran the aggregation logic in real time. The stored procedures and triggers needed to be highly optimized because they ran synchronously as the data was manipulated. Crystal Reports was used as the reporting mechanism.