Real Time Billing Software

In 2005 Comverse asked us to help maintain their existing set of real-time billing web services.

To cut our teeth we were tasked with maintaining the real-time billing web services backend, primarily fixing existing bugs. We were able to start contributing code the very first day on site and continued cleaning things up over a two-month period.

Since we were in maintenance mode we could not build anything new, but we were able to stabilize the existing system, increase unit test code coverage, and decrease the overall number of issues being tracked.

The existing application was VB.NET with an Oracle backend. This was our first foray into VB.NET, having come from a primarily C# background; it was basically the same good old .NET Framework with a different syntax.

Information Management Software

In 2004 Deloitte faced many issues with their existing fat-client audit software and decided it was time to implement a more enterprise-wide solution to meet the geographically distributed needs of their business. The firm was not able to get information and guidelines to their auditors in a timely manner because distribution was a manual process. An enterprise system was designed to allow real-time data exchange to and from the 40K+ auditors throughout the world. The new system synchronized information between individual auditors and the global servers, as opposed to using CDs for information transfer, and allowed managers to view aggregate audit data from several auditors at once via dashboards.

The new system alleviated ongoing application maintenance issues that the firm was facing. Rather than having to distribute thousands of CDs, the auditors in the field were able to connect to the central servers and download the latest application software.

Internationalization requirements were easily met with the new system since all information that required localization was stored in centralized repositories rather than hard-coded within the client application. The information could be exported, translated by third-party consultants, and then re-imported back into the system for synchronization with the auditor client applications.

This system required a distributed deployment environment across the globe. Each country, or region within a country, had its own centralized deployment. Each deployment was required to be available 24/7, so Microsoft clustering technologies were utilized to allow for active/active node clusters for the SQL Server deployments. Clients connected to their regional application servers via a VPN connection, which then directed them, via NLB (Network Load Balancing), to the application server farm.

We were also tasked with helping bridge the gap between multiple teams, both on premises and in Europe. This was one of the more exciting aspects of the project, as we were able to build a true camaraderie with the European team where there had been some issues previously.

The result was a tool that allowed subject matter experts to author content that would streamline and simplify the audit process, increase data-capture validity, and decrease overall time spent on client engagements.

The application was built using WinForms and used a local SQL Server database. Background synchronization was used to bring down the latest content from a centralized SQL Server to the local client. Versioning needed to be taken into consideration so as not to break existing local data relationships with new data from the server. The application was built using C#, XML, and ADO.NET for data access.
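The versioning concern above can be sketched as follows. This is an illustrative modern sketch, not the actual C#/ADO.NET implementation; all names, fields, and the `CLIENT_SCHEMA_VERSION` constant are invented for the example. The idea is that synchronized content carries a version, and the client applies only rows it understands, deferring the rest until it upgrades:

```python
# Hypothetical sketch of versioning-aware content sync. Server rows carry a
# schema_version; the client only applies rows whose version it understands,
# so newer server data cannot break existing local data relationships.

CLIENT_SCHEMA_VERSION = 3

def apply_server_rows(local_db: dict, server_rows: list):
    applied, deferred = [], []
    for row in server_rows:
        if row["schema_version"] <= CLIENT_SCHEMA_VERSION:
            local_db[row["id"]] = row        # upsert into the local store
            applied.append(row["id"])
        else:
            deferred.append(row["id"])       # hold until the client upgrades
    return applied, deferred
```

Deferring unknown versions rather than rejecting them lets the background sync retry after the client software itself has been updated.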

Each regional/country deployment supported synchronization of 100K+ rows of data to 1K–10K client users on a daily basis, and synchronization of 1000K+ rows per week to central firm repositories.

Auditing Software

In 2003 Deloitte asked us to help revamp the client auditing software used by thousands of auditors in the field to facilitate auditing processes at client facilities. The application used a mix of document gathering, canned forms, wizards, and contextual guidance to help auditors gather the most relevant information.

We were tasked with revamping the existing application, bringing it up to date, and making it more of an online/offline experience so that the auditors could both receive the latest procedures and guidelines faster and consolidate data across multiple team members.

We helped guide the development teams technically and built out frameworks to be used in building the client application. We were then tasked with splitting off into a separate group to build the information management system.

The client was built using the .NET Framework (C#), a local SQL Server database, and .NET Office integration.

Order Management Software

In 2002 Barnes & Noble asked us to help design and implement their new order management system. This was in preparation for the upcoming holiday season, and the concern was not being able to keep up with the ordering demands of their internet customers. The single monolithic Unix server couldn’t keep up, couldn’t scale, and was being manually tweaked around the clock to keep the business running.

The replacement system would be geographically distributed, would scale easily horizontally, and would allow for quick and agile expansion in the future.

We built this system on top of the “Order Fulfillment System” we had built previously. It tied together multiple geographic deployments of that system, around ten in all, with real-time order status synchronization from each fulfillment center.

We were extremely successful with this project, completing it on time and meeting the holiday demand. This gave B&N breathing room to continue growing out their online presence, which increased sales over time and allowed the customer base to grow.

Dozens of services were used to asynchronously process the orders, a poor man’s micro-service architecture. We had a direct feed from the web site via SQL Server 6.5 replication. A stored procedure would put inbound orders into an MSMQ queue, which would kick off the various processes: email, credit cards, inventory allocations, sourcing to geographically distributed fulfillment centers, etc.
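The fan-out step described above can be sketched roughly like this. It is a Python stand-in for the original stored-procedure/MSMQ design, under the assumption that each sub-process consumed its own queue; the task names and queue structures are invented for illustration:

```python
# Illustrative sketch: inbound orders are pulled off one queue and fanned out
# to per-task queues, one per sub-process, so each service processes orders
# independently and asynchronously.
from queue import Queue

TASKS = ["email", "credit_card", "inventory_allocation"]

def fan_out(inbound: Queue, task_queues: dict) -> int:
    dispatched = 0
    while not inbound.empty():
        order = inbound.get()
        for task in TASKS:
            task_queues[task].put(order)   # each service consumes its own queue
        dispatched += 1
    return dispatched
```

The key property is that a slow sub-process (say, credit card authorization) only backs up its own queue, not the whole pipeline.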

We developed a framework that allowed several developers to work in parallel and to rapidly build and deploy the various VB6 services. The services were deployed as generic applications on a Wolfpack cluster, Microsoft’s initial implementation of its application clustering service. The services interacted with MTS components (MTS being Microsoft’s initial COM+ offering) and used DTC to integrate the MSMQ and MTS transaction logic.

Four clustered SQL Server deployments were used as the central data repository (two for reporting purposes, one for CRM / order management, and one for fail-over). The primary CRM / order management cluster synchronized data with the two reporting clusters.

Four clustered application server deployments were used to orchestrate the order processing. MSMQ was used to manage the workflow via a proprietary ESB (Enterprise Service Bus). The order processing workflow was broken down into several synchronous/asynchronous sub-processes, which were then each distributed across several queues to sustain the throughput. The system was extremely scalable in that if any sub-process required more processing, or if there was a larger-than-normal influx of orders for the Christmas season, more queues could be added for that sub-process. When an application server reached its maximum throughput, a new application server could be added with its own set of processing queues, and the existing load could then be split.
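The queue-scaling idea above amounts to partitioning one sub-process's work across N queues and growing N when throughput demands it. A minimal sketch of that routing, with invented names and hashing as an assumed partitioning scheme (the original ESB's actual routing rules are not documented here):

```python
# Sketch of queue-per-sub-process scaling: orders for one sub-process are
# spread across N queues by hashing the order id. Adding a queue (and a
# consumer for it) adds parallel capacity without touching the producers'
# logic beyond the queue list.
from queue import Queue

def route(order_id: int, queues: list) -> None:
    queues[hash(order_id) % len(queues)].put(order_id)
```

With three queues an even stream of orders spreads roughly evenly; appending a fourth queue to the list immediately re-spreads new load across four consumers.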

Upon completion of this project we were seeing the following transaction statistics:

  • 150K+ order confirmation e-mails per day
  • 500K+ item sourcing and inventory allocation transactions per day
  • 1000K+ item / order status change transactions per day
  • 10000K+ OLTP to OLAP synchronization transactions per day
  • 100K+ CRM transactions per day
  • 10000K+ inventory update transactions per day

Order Fulfillment Software

In 2000 Barnes & Noble asked us to help design and implement the systems used throughout their fulfillment center. The upcoming holiday season was a concern, and this online bookseller faced the possibility of not being able to keep up with the shipping demands of their internet customers. Their legacy system was on its last legs and couldn’t keep up with the growing demands of the business. A scalable enterprise solution was implemented to meet those needs, allowing the company to grow its online sales by a factor of x over the next several years. This project enabled the business to expand nationally and integrate with partners for parallel geographic order fulfillment. It also paved the way for a more automated warehouse floor, allowing the company to move from 4 shipping lanes to as many as the warehouse supported.

This project also removed the need for batched order fulfillment and gave way to real-time fulfillment. As inventory came in the back door and was inducted into the warehouse management system, it was matched against outstanding orders and routed to the order assembly area for packing and shipping. A customer could order a book online that was not in inventory at the fulfillment center and still have it shipped out the same day.
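The induction-time matching described above can be sketched in a few lines. This is a hedged illustration, not the actual VB6/COM+ logic; the data structures and the oldest-order-first policy are assumptions made for the example:

```python
# Sketch of real-time fulfillment: as each unit is inducted at receiving, it
# is matched against the oldest outstanding order for that item and routed
# straight to assembly instead of into a storage bin.
from collections import deque

def induct(item_sku: str, open_orders: dict):
    queue = open_orders.get(item_sku)
    if queue:
        order = queue.popleft()            # oldest outstanding order wins
        return ("route_to_assembly", order)
    return ("route_to_storage", None)
```

This is what allows a book that was out of stock at order time to ship the same day it arrives: the match happens at the receiving dock rather than in a nightly batch.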

By automating most of the existing manual processes, barnesandnoble.com saw a huge efficiency increase within the warehouse, resulting in customers getting their books faster. I believe a couple of the Harry Potter books were first released during this time, and I personally helped automate batch picking and packaging of nearly 100,000 customer orders.

Each module was created as a VB6 ActiveX document hosted within IE 4. We chose this model because of the ease of deployment over the intranet and because we needed native code in the browser for integrating with scanners, printers, etc. The ActiveX documents interfaced with COM+ components via DCOM, which in turn handled all the data access.

The system required 24/7 availability in order to support the 24/7 warehouse operations. To support this, several pairs of application servers were clustered together and accessed via virtual IP addresses from client workstations using the DCOM protocol. A clustered SQL Server handled all OLTP database transactions, and SQL Server replication was used to replicate OLTP data to a second clustered SQL Server used for OLAP mining and reporting.

Upon completion of this project we were seeing the following transaction statistics:

  • 250K+ order matching transactions per day spread across 3 shifts of ~20 client workstations.
  • 150K+ order packing transactions per day spread across 3 shifts of ~10 client workstations.
  • 150K+ order shipping transactions per day spread across 3 shifts of ~5 client workstations.
  • 150K+ shipping confirmation e-mails per day.
  • 1000K+ OLTP to OLAP synchronization transactions per day.

Staff Notification Software

In 2000 Barnes & Noble asked us to help design and implement a back-office email system that would be used to notify internal staff of various events occurring within the ordering process. I developed the service using a component architecture with the intent of reuse by multiple developers.

By receiving real-time notifications, staff were able to react to ordering events more quickly, increasing customer satisfaction.

This system was developed as a clustered service on Windows NT 5.0 using SQL Server 6.5 as the back-end. The service received messages off an MSMQ queue, formatted the emails, and then sent them out via SMTP through Microsoft Exchange.

The service itself was written so that other developers could easily create their own services based upon a set of base classes and interfaces. A developer needed only to develop a COM component and pass its identifier to the service on the command line. The queuing logic and transactions were all orchestrated from within the service itself, so developers didn’t need to understand the nuances of MSMQ, transaction boundaries, etc.
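The base-class pattern above can be illustrated with a Python analogue. This is a sketch, not the original VB6/COM framework: the class names are invented, and an in-process re-enqueue stands in for the real MSMQ transaction rollback. The host owns the queue reads and the commit/rollback boundary; a developer supplies only a handler:

```python
# Illustrative analogue of the pluggable service framework: the base service
# owns queue reads and "transaction" boundaries; developers plug in a handler
# component without touching the queuing logic.
from queue import Queue

class BaseQueueService:
    def __init__(self, handler):
        self.handler = handler             # developer-supplied component

    def run_once(self, queue: Queue) -> int:
        processed = 0
        while not queue.empty():
            msg = queue.get()
            try:
                self.handler.handle(msg)   # developer-supplied logic
                processed += 1             # "commit" on success
            except Exception:
                queue.put(msg)             # "rollback": re-enqueue and stop
                break
        return processed

class EmailHandler:
    """Example handler: formats notification emails (send step omitted)."""
    def __init__(self):
        self.sent = []
    def handle(self, msg):
        self.sent.append(f"To: {msg['to']}\n{msg['body']}")
```

A new service is then just a new handler class; the queue plumbing never changes.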

Incident Management Software

In 1998 Barnes & Noble wanted an internal system for tracking and reporting issues for their internal IT systems.

This was a web-based reporting tool used internally to capture bugs and comments for the projects we were releasing.

Users were able to easily capture issues from their desktops using the web browser as opposed to documenting on paper or email. This allowed developers to use the captured data to triage issues.

The site was implemented in classic ASP using SQL Server 6.5 as the back-end. This was my introduction to web programming using a scripting language. I really enjoyed it and was extremely excited when I saw that just pushing out a new .asp page and hitting F5 in the browser was all it took to see the changes.

Real-Time Demand Analysis Software

In 1998 Barnes & Noble asked us to help implement a real-time demand analysis tool to help with warehouse inventory issues they were facing.

This system was a real-time process that aggregated warehouse inventory allocations and presented various views of the data to users via online reports. The reports were able to show both real-time and historical demand.

B&N was able to react to spikes in inventory usage in real-time and mitigate “out of stock” scenarios.

The client had an existing SQL Server 6.5 database in place. We implemented a series of stored procedures and triggers that ran the aggregation logic in real time. These needed to be highly optimized because they ran synchronously during data manipulation. Crystal Reports was used as the reporting mechanism.
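The trigger-driven aggregation can be sketched as follows. This is a Python stand-in for the actual SQL triggers (the table and column names are invented): every allocation synchronously bumps a running per-item total, so reports read precomputed aggregates instead of scanning raw allocation rows:

```python
# Sketch of trigger-style real-time aggregation: the equivalent of an
# AFTER INSERT trigger maintaining a demand-totals aggregate table.
demand_totals: dict = {}

def on_allocation_insert(item_id: str, qty: int) -> None:
    # runs synchronously with the insert, so it must stay cheap
    demand_totals[item_id] = demand_totals.get(item_id, 0) + qty
```

Because this work happens inside the write path, keeping it to a single keyed update is what made the SQL version's heavy optimization necessary.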

Book Layout Software

In 1998 Barnes & Noble asked us to design and implement an application that would help them visually lay out their store shelves in the bargain book section.

Book and shelf dimensions were scaled to the computer display to allow the user to easily see how the books would fit on the shelves, color-coordinate book covers, visually see the pricing, etc. Once completed, the user would print out the visual layout and a textual list to be given out to the stores to facilitate the physical layout of the shelves.

B&N was able to save labor costs and optimize their planning process for in-store holiday campaigns.

This was a fun project, as it involved a lot of graphical programming. The MDI application was built entirely with Visual C++ and MFC. All layouts were saved in binary format to the local disk. In order to provide maximum graphical fidelity, two 21-inch CRTs were used side by side with an extended desktop.