Monday, January 25, 2010

Pull-to-order Environment

This first case study illustrates usage of various types of kanban tickets described in the previous part of this series. In this case, the replenishment policies for the end item reflect a pull-to-order (PTO) manufactured kanban, whereas replenishing kanbans are used for the components. PTO represents a special case of make-to-order (MTO) manufacturing environments, with kanban tickets providing linkage to sales orders and acting as the primary coordination tool.

The case study involves a two-level product structure to build the end item, identified as Product #1 in figure 1. The left side of figure 1 depicts the product structure in terms of the bill of material (BOM) and routing. A final assembly cell produces Product #1 to sales order demand, and completed items are placed in a shipment staging area for subsequent shipment. Production of Product #1 requires Subassembly #1, Part #1A, and other purchased components. These purchased components are stocked next to the final assembly cell, with receipts direct to the location. An inventory location on the factory floor is commonly called a floor stock location or a supermarket location. The right side of figure 1 depicts the factory layout in terms of cells and inventory locations.


Subassembly #1 is produced to stock in the subassembly cell, using Part #1B and other purchased components. These purchased components are first received in a stockroom and then transferred to the floor stock location next to the subassembly cell. Completions of Subassembly #1 are placed in the final assembly floor stock location.
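For readers without figure 1 at hand, the two-level product structure and stocking locations can be sketched as simple data structures. The Python snippet below is purely illustrative; the item and location names follow the figures, but nothing here comes from an actual system:

```python
# Illustrative only: the two-level product structure (BOM) and stocking
# locations described above, expressed as simple Python dictionaries.

BOM = {
    "Product #1": ["Subassembly #1", "Part #1A", "Other purchased components"],
    "Subassembly #1": ["Part #1B", "Other purchased components"],
}

STOCKING_LOCATION = {
    "Product #1": "Shipment staging area",
    "Subassembly #1": "Final assembly floor stock",
    "Part #1A": "Final assembly floor stock",
    "Part #1B": "Subassembly floor stock (replenished from the stockroom)",
}

def components_of(item: str) -> list[str]:
    """Return the direct components needed to build an item (empty for purchased parts)."""
    return BOM.get(item, [])

print(components_of("Product #1"))
# ['Subassembly #1', 'Part #1A', 'Other purchased components']
```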

Purchased kanbans provide the basis for replenishing purchased material stocked in the stockroom and the final assembly floor stock inventory locations. A purchased kanban acts as a signal to the vendor, and the kanban receipt transaction updates the item's inventory balance. Transfer kanbans replenish the floor stock inventory for the subassembly cell with material transferred from the stockroom. A transfer kanban acts as a signal to the stockroom, and the kanban receipt transaction transfers inventory between the two locations. The left side of figure 2 depicts these kanban signals and the associated kanban receipts.
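To make the mechanics concrete, here is a minimal Python sketch, again illustrative rather than taken from any particular ERP system, of how the purchased and transfer kanban receipt transactions affect inventory balances:

```python
# Illustrative only: how the kanban receipt transactions described above
# affect inventory balances at each location.

inventory: dict[tuple[str, str], int] = {}   # (location, item) -> on-hand quantity

def receive_purchased_kanban(item: str, location: str, qty: int) -> None:
    """Purchased kanban: the signal goes to the vendor; the kanban receipt
    increases the on-hand balance at the receiving location."""
    inventory[(location, item)] = inventory.get((location, item), 0) + qty

def receive_transfer_kanban(item: str, from_loc: str, to_loc: str, qty: int) -> None:
    """Transfer kanban: the signal goes to the stockroom; the kanban receipt
    moves inventory between the two locations."""
    inventory[(from_loc, item)] = inventory.get((from_loc, item), 0) - qty
    inventory[(to_loc, item)] = inventory.get((to_loc, item), 0) + qty

# Replenish the stockroom from the vendor, then move material to the
# subassembly cell's floor stock location.
receive_purchased_kanban("Part #1B", "Stockroom", 50)
receive_transfer_kanban("Part #1B", "Stockroom", "Subassembly floor stock", 20)
print(inventory)
# {('Stockroom', 'Part #1B'): 30, ('Subassembly floor stock', 'Part #1B'): 20}
```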

Back to Apama (not Panama or Obama, Bozo!)

Progress Apama became part of Progress Software via the acquisition of the former Apama Ltd in April 2005. Apama is the core technology foundation for Progress’ CEP initiatives and for the company’s go-to-market efforts that leverage that CEP platform in capital markets for such “daily bread” activities as algorithmic trading, market aggregation, real-time pricing, smart order routing, and market surveillance.

Prior to its acquisition by Progress Software, Apama had a few dozen customers in London, New York, and Boston. Today, however, after leveraging the global parent’s infrastructure, Apama is marketed and sold in all the major financial centers in the world.

Apama was founded in 1999 in Cambridge (UK) by John Bates and Giles Nelson. Fellow Cantabrigians and CEP visionaries Bates and Nelson are co-holders of the patents on Apama’s core technology, which is a commercially productized expression of their efforts to create a platform suited to the unique characteristics of “event-based” applications.

Originally, Apama set out to resolve a number of real-time mobility issues in telecommunications, but soon realized that there were commercial opportunities in a much wider range of environments. As a result, the company has historically focused on financial markets, and specifically on financial trading, where real-time event-based systems are in high demand.

The capital markets segment has indeed provided an early proof point for the Apama CEP platform. Apama’s design philosophy and architecture were intended to provide a platform that allows traders to quickly develop and deploy distinctive proprietary strategies that exploit market opportunities and mitigate risks.

In addition to the above-mentioned CEP applications in capital markets, other current (or future) uses in the segment include the following: commodities trading, bond trading and pricing, foreign exchange (Forex) aggregation and algorithms, futures and options algorithms, equities trading, cross-asset trading, real-time risk management, broker algorithms, news-driven algorithms, and so on.

Principles of CEP-based Systems

In plain English, CEP lends itself well to any environment that treats any business update as an “event.” Such organizations want to enable users to rapidly define event-based business rules that identify patterns indicating opportunities and threats to the business. These encapsulated rules (expressed either as “if-then” statements or as Structured Query Language [SQL]-style statements) are loaded into a real-time computing (RTC) CEP engine.

The correlation engine is permanently connected to multiple event sources and destinations (handling large volumes of events and related data points) and provides analysis and response with extremely low latency. Events can also be captured and preserved in time order for historical pattern analysis and root-cause analysis (RCA).
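For the technically inclined, the following Python sketch illustrates this general idea. It is not Apama’s actual event processing language, and the class and field names are my own illustrative assumptions: rules are registered as condition/action pairs, matched against a live event stream, and every event is kept in time order for later analysis.

```python
# Illustrative only: a toy "CEP engine" that matches if-then rules against a
# live event stream and preserves events in time order for later analysis.
# This is NOT Apama's actual EPL; all names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    timestamp: float   # event time, used to preserve time order
    kind: str          # e.g. "price_update", "order_filled"
    payload: dict

@dataclass
class ToyCepEngine:
    # Each rule is an (if-condition, then-action) pair.
    rules: list[tuple[Callable[[Event], bool], Callable[[Event], None]]] = field(default_factory=list)
    history: list[Event] = field(default_factory=list)   # time-ordered capture for RCA / replay

    def add_rule(self, condition: Callable[[Event], bool], action: Callable[[Event], None]) -> None:
        """Register an if-then rule: when condition(event) is true, run action(event)."""
        self.rules.append((condition, action))

    def on_event(self, event: Event) -> None:
        """Evaluate every registered rule against the incoming event."""
        self.history.append(event)
        for condition, action in self.rules:
            if condition(event):
                action(event)

# Example rule: flag any price update that moves more than 5 percent.
engine = ToyCepEngine()
engine.add_rule(
    lambda e: e.kind == "price_update" and abs(e.payload["change_pct"]) > 5.0,
    lambda e: print(f"ALERT at t={e.timestamp}: {e.payload}"),
)
engine.on_event(Event(timestamp=1.0, kind="price_update", payload={"symbol": "XYZ", "change_pct": 6.2}))
```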

Given that algorithmic trading in capital markets was one of the first real-life applications of CEP, let’s translate the above general CEP principles into trading terms. The continuing digitization of financial market data and the advancement of electronic market access have created a market environment in which competitive differentiation among financial service firms rests with split-second algorithmic execution that can exploit minuscule and momentary advantages in price, time, and available liquidity.

To that end, a trading company will treat any market update as an “event” and will enable users to rapidly build quantitative algorithms (based on their vast experience and know-how) to identify trading opportunities and risk breaches. The germane trading rules are then loaded into a trading system that offers real-time analysis and response with latency measured in milliseconds.

The trading system is permanently connected to a number of relevant market data sources, news feeds, and trading venues (exchanges). Finally, events can be captured and preserved in time order for backtesting and digital forensics analysis.
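Translated into code, a toy version of such a rule might look like the following self-contained Python sketch; the venues, symbols, and prices are illustrative assumptions rather than anything from a real trading system. Each market update (“event”) is checked against the latest quotes from other venues, and every quote is kept in time order for backtesting:

```python
# Illustrative only: a toy cross-venue rule. Each market update ("event")
# arrives as a Quote and is checked against the latest quotes from other
# venues; every quote is kept in time order for backtesting.

from dataclasses import dataclass

@dataclass
class Quote:
    timestamp: float
    venue: str     # e.g. "ExchangeA", "ExchangeB" (made-up venue names)
    symbol: str
    bid: float
    ask: float

captured: list[Quote] = []          # time-ordered capture for backtesting
last_quote: dict[str, Quote] = {}   # most recent quote seen per venue

def on_quote(q: Quote) -> None:
    """Rule: if this venue's bid crosses another venue's ask, flag an opportunity."""
    captured.append(q)
    for venue, other in last_quote.items():
        if venue != q.venue and other.symbol == q.symbol and q.bid > other.ask:
            print(f"t={q.timestamp}: buy {q.symbol} on {venue} at {other.ask}, "
                  f"sell on {q.venue} at {q.bid}")
    last_quote[q.venue] = q

on_quote(Quote(1.000, "ExchangeA", "XYZ", bid=10.00, ask=10.02))
on_quote(Quote(1.001, "ExchangeB", "XYZ", bid=10.05, ask=10.07))   # ExchangeB's bid crosses ExchangeA's ask
```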

In summary, the drivers for CEP adoption are the following:

* Applications with demanding throughput and latency requirements. Such requirements stem from market trends such as higher-velocity business event flows, more voluminous (and yet shorter-lived) transactions, and rapidly changing market conditions. These trends in turn challenge customers to detect opportunities and threats in real time and to monitor the health of their business; and
* The need for rapid software development and customization, coupled with increasing application complexity (temporal and/or spatial logic, real-time analytics, etc.). The customers’ challenge in this regard is how to accelerate the deployment of new capabilities.

Processing Complex Events (During these, oh well, Complex Times) – Part I

The worn-out saying about how we learn new things every day applies to this blog topic too. Namely, my interest in Progress Software Corporation has long been due to its renowned OpenEdge development platform. Indeed, many enterprise resource planning (ERP) and other application providers, as Progress Software partners, leverage (embed) OpenEdge. Sure, I also follow and have recently written about the company’s forays into the service-oriented architecture (SOA) space with its two offerings: Actional for web services management and Sonic for enterprise service bus (ESB) and messaging.

But in late 2007, out of mere courtesy, I accepted a briefing about Progress Apama, the company’s platform for complex event processing (CEP), algorithmic trading, and whatnot. Given the overwhelming nature (“rocket science” of a sort) of the offering’s concept, I now admit that I could not wait for the briefing to end.

Actually, I felt bamboozled like those ordinary mortal FBI agents in CBS’ primetime hit show “Numb3rs.” In that show, time and again the whiz kid math genius (the brother of the FBI team leader) tries to explain to these action-rather-than-theory agents how some complex and arcane math theory can be applied to make sense out of seemingly chaotic and unrelated events. Eventually, complex math solves some important crimes, often by detecting patterns that are not obvious to the naked eye.

Well, fast-forward to early 2009, when at Progress’ Analyst Summit (a traditional Boston winter fixture) we could all find out that Progress Apama is possibly the best-performing and fastest-growing part of the company. OpenEdge, while still contributing over 60 percent of Progress’ total revenues, is a mature business that is now sold mostly to independent software vendors (ISVs). In addition, the recent financial markets (and, consequently, overall economic) crisis and the related cases of high-profile fraud (“white-collar crime”) have made me conduct my own study of Apama and become familiar with its underlying concept.

Frankly, I no longer grapple as much with the concept of CEP per se (Progress Software refers to CEP as “The Brains of the High Velocity Business”). Where I still get lost, though, is when it comes to CEP’s relationships with other, similar technologies and concepts “du jour.”