Sales, Store & Xact
How does it work?
We have one main 'corporate' database behind a firewall. All the information the complete system requires is stored here, and it typically sits at the company's headquarters. The database structure supports multiple user levels and many departments (analogous to small companies). It also caters for multiple stock pools, price breaks/groups, customers, sales and purchase orders, back orders, bin numbers, invoice production, order tracking, postcode groups and complex sales analysis.
The central database uses the FairCom server, and can run on many different platforms. This server allows near-24/7 operation, with online backups, full transaction processing and very fast operation. It can also provide file mirroring and heterogeneous network operation.
The Sales.exe application is used by telesales, accounts and marketing. It uses the local area network to access the central database and provides most of the functionality outlined above. Sales.exe is a Windows application. Typically, telesales operators process credit card transactions through a 'PDQ' machine connected directly to a credit card service provider.
The Store and Xact applications can be compiled to run on almost any platform. They connect to your web server via a protocol called FastCGI, which, unlike CGI, does not 'kill' the application after each request. Our applications handle not only multiple consecutive requests from the web server, but multiple requests at the same time. Different requests may take different amounts of time, and some may have to wait for external events (such as a database responding to a query). We 'share' CPU time between all the different requests, and any that are idle release their time for others to use. This technique is called multi-threading, and combined with an efficient language like C++ it allows us to handle many more transactions per server than other solutions.
Using FastCGI has another advantage: we can run the Store application on multiple servers, so we can truly say that we have a 'scalable' solution.
To continue: we have our main server app, Store, which serves the website (i.e. provides content, via the web server, to the user). It does everything except credit card handling (the 'checkout' facilities, in supermarket terms). The Store app provides searching, categories, layout, baskets and so on. Because the Store app uses sophisticated caching for its data sources, it can realistically be located at a different site from the central database. It has several internal 'management' threads, one of which takes care of cache synchronization.
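The caching-plus-synchronization pattern can be sketched as follows (a simplified illustration, assuming a periodic full refresh; PriceCache and the loader function are hypothetical names, and the loader stands in for a query against the central database):

```cpp
#include <chrono>
#include <condition_variable>
#include <functional>
#include <map>
#include <mutex>
#include <string>
#include <thread>
#include <utility>

// Read-through cache with a background synchronization thread, loosely
// modelled on the Store app's 'management' threads. Reads are served
// from the local copy, so no round trip to the central database is
// needed per request.
class PriceCache {
public:
    using Loader = std::function<std::map<std::string, double>()>;

    PriceCache(Loader load, std::chrono::milliseconds interval)
        : load_(std::move(load)), interval_(interval) {
        refresh();  // warm the cache before serving requests
        sync_ = std::thread([this] { syncLoop(); });
    }
    ~PriceCache() {
        {
            std::lock_guard<std::mutex> lock(stopMu_);
            stop_ = true;
        }
        stopCv_.notify_one();
        sync_.join();
    }
    double price(const std::string& sku) {
        std::lock_guard<std::mutex> lock(mu_);
        return data_.at(sku);
    }
    // Pull a fresh snapshot from the data source.
    void refresh() {
        auto fresh = load_();  // stands in for the central-database query
        std::lock_guard<std::mutex> lock(mu_);
        data_.swap(fresh);
    }
private:
    void syncLoop() {
        std::unique_lock<std::mutex> lock(stopMu_);
        // Each timeout triggers a refresh; wait_for returns true (and the
        // loop exits) once the destructor sets stop_.
        while (!stopCv_.wait_for(lock, interval_, [this] { return stop_; }))
            refresh();
    }
    Loader load_;
    std::chrono::milliseconds interval_;
    std::mutex mu_;
    std::map<std::string, double> data_;
    std::mutex stopMu_;
    std::condition_variable stopCv_;
    bool stop_ = false;
    std::thread sync_;
};
```

Because the cache, not the central database, answers each page request, the Store app tolerates the latency of being hosted at a different site.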
The second server app, Xact, can be located elsewhere again. It connects to the credit card clearing facilities and to the Store app(s), and uses encryption for all transactions. It verifies or declines transactions, and enables the transaction to take place 'offshore' if required. It is rather like a supermarket checkout line, with multi-threading and queueing.
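The document does not specify how a transaction is verified before it is forwarded to the clearing house; purely as an illustration of the kind of cheap pre-check a checkout stage can run, here is the standard Luhn checksum used to catch mistyped card numbers (this is an assumption about a plausible first step, not Xact's actual logic):

```cpp
#include <cctype>
#include <string>

// Luhn checksum: from the right, double every second digit (subtracting
// 9 if the result exceeds 9) and require the total to be divisible by
// 10. A failing number can be declined locally without ever contacting
// the clearing facility.
bool luhnValid(const std::string& number) {
    int sum = 0;
    bool doubleIt = false;  // becomes true for every second digit from the right
    for (auto it = number.rbegin(); it != number.rend(); ++it) {
        if (!std::isdigit(static_cast<unsigned char>(*it))) return false;
        int d = *it - '0';
        if (doubleIt) {
            d *= 2;
            if (d > 9) d -= 9;
        }
        sum += d;
        doubleIt = !doubleIt;
    }
    return !number.empty() && sum % 10 == 0;
}
```

A check like this only screens out typos; the real verify-or-decline decision still comes from the clearing facility.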
Going back to Store: what we now have are possibly thousands of anonymous users floating around our site, each with a 'ticket' (id) that keeps track of their basket. They are doing this on servers located close to them, so they get a fast connection. Because the site's content is completely dynamic, i.e. created specifically for the user, we can serve the 'pages' in different languages, provide unlimited categories of product, handle different currencies, and correctly handle stock levels. No 'static' solution can do this.
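The ticket idea is simple to sketch (illustrative names only; the real Store app's data structures are not shown in this document): the server issues an id per anonymous visitor, and the id keys that visitor's basket.

```cpp
#include <cstddef>
#include <map>
#include <string>
#include <vector>

// One basket per anonymous visitor, keyed by ticket id.
struct Basket {
    std::vector<std::string> items;  // SKUs added so far
};

class TicketStore {
public:
    // Issue a new ticket (id) for an anonymous visitor.
    long openTicket() {
        long id = next_++;
        baskets_[id] = Basket{};
        return id;
    }
    void addItem(long ticket, const std::string& sku) {
        baskets_.at(ticket).items.push_back(sku);
    }
    const Basket& basket(long ticket) const { return baskets_.at(ticket); }
    // Free the 'user space' when the visitor logs out.
    void closeTicket(long ticket) { baskets_.erase(ticket); }
    std::size_t activeUsers() const { return baskets_.size(); }
private:
    long next_ = 1;
    std::map<long, Basket> baskets_;
};
```

The ticket travels with each request (for example in the URL or a cookie), so the server can find the right basket no matter which page the user asks for next.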
When these users decide to purchase, they are directed to a secure server running Xact, possibly located offshore. This allows both parties to effect the transaction in a different jurisdiction, which can be mutually beneficial. The process is logged to the central database, and once the transaction is verified the Sales system outputs invoices and delivery information. Stock levels are continuously updated, and the goods are dispatched as in a normal telesales operation.
If users log out without purchasing, their 'user space' is freed for another user. If they don't log out (most don't), a user management thread frees up idle 'user spaces'.
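The reaping pass itself can be sketched like this (a simplified, self-contained illustration with hypothetical names; time is passed in explicitly as seconds so the sweep is easy to follow, whereas the real management thread would read the clock):

```cpp
#include <cstddef>
#include <map>

// Each session records when it was last seen; a management pass frees
// any that have been quiet for longer than the timeout.
class SessionTable {
public:
    // Record activity for a ticket at time 'now' (seconds).
    void touch(long ticket, long now) { lastSeen_[ticket] = now; }

    // Free 'user spaces' idle for more than 'timeout' seconds;
    // returns how many were freed.
    std::size_t sweep(long now, long timeout) {
        std::size_t freed = 0;
        for (auto it = lastSeen_.begin(); it != lastSeen_.end();) {
            if (now - it->second > timeout) {
                it = lastSeen_.erase(it);
                ++freed;
            } else {
                ++it;
            }
        }
        return freed;
    }
    std::size_t active() const { return lastSeen_.size(); }
private:
    std::map<long, long> lastSeen_;  // ticket id -> last activity time
};
```

Running a pass like this periodically keeps abandoned baskets from tying up memory, so the capacity figure below refers to genuinely active users.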
Each Store application instance can handle several thousand users at the same time.