By David Widerhorn, CEO, David Widerhorn Consulting LLC
Let’s face it: today’s futures markets aren’t what they were two decades ago, or even two years ago. The buzzwords “high frequency” and “algo” have gone from obscure conversation topics among graduate physics and computer science students to a common focal point in today’s media, and rightfully so. Many studies show that the majority of volume traded on futures exchanges is either high frequency or algorithmic in some fashion. We don’t need studies to tell us the market has changed: ask any ex-floor trader how differently the market digests new information. With more and more traders entering the “algo world”, how can a trader be sure they are investing in the right technology, developing it cost-efficiently and minimizing their time to market? With the advent of ultra-low-latency global networks, advanced computer hardware, fully automated trading software and countless other technologies, there are seemingly infinite ways to try to capture edge in today’s markets. While this is exciting from an opportunity perspective, it can be overwhelming to the individual trader or the small-to-medium-sized prop firm looking to compete in automated trading. It is important for these traders to get a realistic sense of where they can compete successfully against other market players and, most importantly, to understand the risks and rewards of each technology.
More than likely, your reality is that you are not Goldman Sachs or Citadel, and you don’t have a multi-million-dollar technology budget. Even if you do, if you are starting now, you are late to the game. This limits the realistic scope of the strategies you can and should seek to deploy. For example, a handful of very large proprietary trading firms, hedge funds and investment banks compete successfully in cross-exchange arbitrage. They have invested many millions of dollars in the lowest-latency connection lines, using technology such as millimeter-wave radio links to move market data. They have hired PhDs to burn their algorithms onto FPGAs (ultra-fast, field-programmable hardware chips), allowing them to see and respond to an arbitrage opportunity before 99% of trading platforms even see the tick on their screen. Knowing this, it is illogical to invest your hard-earned money in any technology solution that attempts to generate profits with this exact strategy; your chances of success are essentially zero. The reality check is that the majority of the “obvious”, minimal-risk trading strategies have already been exploited by firms who have gone to great lengths to compete in that space for many years, making them not a viable choice for you.
The good news is that there is still a lot of opportunity out there. Most traders, once they swallow the pill that many of the old, minimal-risk trades are gone, are still able to find edges by combining their trading experience with some of the more modern, quantitative approaches. I could write an entire article on how and where to find edges in today’s markets and on those quantitative approaches, but that is not the focus of this one. Chances are that whether you work alone or with a consulting firm to develop a trading concept, that concept will involve some kind of systematic, rule-based trading. Whether trading outright symbols directly or through a spreading engine, whether trading trend or fade, technicals or price action, at the end of the day the work involved in implementing your trading idea boils down to two things: research and development. The unfortunate reality is that most traders, unless they have a quantitative or programming background, have a hard time managing time to market, cost, quality and results when it comes to building high-tech trading solutions. As such, many are dissuaded from developing their custom trading concepts and instead opt for off-the-shelf solutions such as trading systems used by others or semi-automated order-entry tools, limiting their true potential. This does not have to be the case at all. In fact, when done correctly, a customized solution can be implemented with a very short time to market, at the same or lower cost as many off-the-shelf solutions, and extremely effectively. But before we explain how, let me tell you a story I have seen over and over in my career (let’s call it “the traditional approach”):
The trader approaches a “boutique” consulting firm with a trading concept, eager to finally have an “algo edge”, and requests an estimate of both time and cost. Often a small consulting firm eager to win the project will bid low and, over the course of a couple of conversations with the trader, develop a set of rules and requirements for the trading concept. Soon after, they begin software development, and the trader eagerly awaits the completion of the software so they can finally “win” against the algos. A few weeks later, the consulting firm calls the trader and hands off the algo, and the algo is profitable and successful.
Everything sounds good so far, so where’s the problem?
The problem is that the ending I just presented is never the one that happens in reality. In this example, both the consulting firm and the trader have shown their inexperience, and both will pay in different ways.
What happens in reality is that, even assuming the consulting firm has perfect developers and delivers perfect software (not realistic, as “quality assurance” – a.k.a. bug fixing – is part of every engineering process), the trader will get a piece of software that accomplishes exactly what the simple set of rules and requirements for the trading concept dictated. The problem is that this simple set of rules hasn’t been adequately researched. This creates several problems: first, the trader has no clue whether these rules will capture the edge they are looking for; second, the trader has no clue in which market or at what times it makes sense to deploy these rules; and third, the trader has no clue whether these rules are comprehensive enough to handle all possible market scenarios. This leaves the trader with a piece of software that somewhat accomplishes their goals but more than likely has not generated a penny of profit. Moreover, the trader is upset that their expectations have not been met and wants the consulting firm to “fix” the problem, while the consulting firm demands more payment. The resources that have to be engaged to re-evaluate the business requirements and re-engineer the software become extremely costly. This leads to an unhappy relationship between the trader and the consulting firm, a loss of confidence in the trader’s ability to succeed algorithmically and, overall, a highly inefficient situation. The core problem is that mutual inexperience led to an improperly researched trading concept, money invested and expectations not met.
The above story makes the case for why research and thorough specifications are critical to the success of any algorithmic trading project. But there is another risk that traders all too often overlook: the risk of the technology itself, both from a correct-functionality perspective and from a time- and cost-management perspective. Trading algorithms, no matter how simple they may seem, are complex pieces of software that must process real-time market data (which is not predictable), make complex decisions and manage orders which can be executed at any time. To be “high performance”, they must combine all of this logic in a way that can be computed concurrently, taking advantage of multiple processor cores. The complexity arises because the program must keep track of its internal state and make decisions while multiple pieces of logic are affecting that state simultaneously. This type of engineering is similar to that used in robotics, automobiles and other systems with multiple simultaneous inputs. The difference between your typical Fortune 500 company and a small consulting firm writing your “algo” is that the Fortune 500 company has a large team of people dedicated to spending weeks and months writing and testing extremely detailed specifications, an internal quality-assurance process involving separate teams of people, an ample budget and months to years to get a product to market. Given this difference, it is only natural that writing these systems from scratch, in a hurry, using a “hard-coded” approach specific to the algorithm being written, is an approach doomed to suffer a lot of costly problems.
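To make the concurrency hazard above concrete, here is a minimal sketch (all names hypothetical, and far simpler than any production system) of shared strategy state touched by two event streams at once: market-data signals on one thread, fills from the order gateway on another. Without the lock, the two threads could interleave their read-modify-write updates and corrupt the position count.

```python
import threading

class StrategyState:
    """Shared algo state mutated by multiple concurrent event streams."""

    def __init__(self):
        self._lock = threading.Lock()
        self.position = 0       # net contracts held
        self.working_qty = 0    # quantity resting in the market

    def on_signal(self, qty):
        # Called from the market-data thread when a signal places an order.
        with self._lock:
            self.working_qty += abs(qty)

    def on_fill(self, qty):
        # Called from the order-gateway thread when that order is filled.
        with self._lock:
            self.position += qty
            self.working_qty -= abs(qty)

state = StrategyState()
state.on_signal(2)   # market-data thread works 2 lots
state.on_fill(2)     # gateway thread reports the fill
print(state.position, state.working_qty)  # 2 0
```

The point is not this particular lock, but that every piece of state an algo tracks has to be safe against exactly this kind of simultaneous update, which is what disciplined specifications and testing are for.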
The key to high-quality software, especially in an industry like trading that requires low-latency, real-time, concurrent data processing, lies in proper, thoroughly researched design specifications and a scalable, modular approach to building software. This is why it is so critically important for traders to look at bids not purely from a price-competitive perspective, but from the perspective of ensuring that their concept is adequately researched and implemented in a manner that will let them easily adapt in the future, whether to other execution venues, other data sources or changes in the business logic. This sounds great in theory, but it doesn’t take a rocket scientist to realize that this “correct” approach could be very costly in man-hours: time and resources will be spent researching and developing business specifications, which involves careful analysis of historical data and building simulation models. Time will also be spent developing the software in a modular, testable manner. Finally, time will be spent testing the software itself.
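What “modular and testable” means in practice can be illustrated with a toy indicator module (a hypothetical example, not any firm’s actual code): because the module has a single responsibility and no dependency on data feeds or order routing, it can be verified in isolation against a small historical fixture before it ever touches live markets.

```python
def sma(prices, window):
    """Simple moving average over the last `window` prices.

    One small, single-responsibility module: it knows nothing about
    where prices come from, so it can be unit-tested on its own.
    """
    if window <= 0:
        raise ValueError("window must be positive")
    if len(prices) < window:
        return None  # not enough history yet
    return sum(prices[-window:]) / window

# Tiny fixtures stand in for the historical-data analysis phase:
assert sma([1.0, 2.0, 3.0, 4.0], 2) == 3.5   # mean of last two prices
assert sma([1.0], 3) is None                  # insufficient history
```

Each indicator, signal and execution component tested this way becomes a reusable building block, which is exactly where the time spent on modular development pays for itself.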
So how does a trader have their cake and eat it too? The answer can be found in a Hybrid Consulting model. Under this model, research and development are unified into one effort, using a series of pre-made, industry-tested modules as the base and custom software programming as the connecting layer. Using this approach, the trading concept can be researched, developed and tested, all in the time it takes to complete just one of those tasks under the traditional consulting model. And most importantly, the software that is delivered acts not only as a trading algo, but as a simulation model.
Under the Hybrid Consulting (HC) model, a typical project would look like this:
The trader approaches a consulting firm that follows the HC model with a trading concept. The consulting firm works with the trader to understand the underlying alpha in the trading concept, whether it be price driven, volume driven, signal driven, etc. The firm then establishes a list of candidate markets and time windows, as well as a list of candidate indicators and signals to try to capture the alpha. At this point, the consulting firm is able to write the preliminary business specifications and select the appropriate building blocks from its base of pre-made modules, such as low-latency connectivity to vendors or exchanges, technical indicators, etc.
Now here’s the “secret sauce”: all of these modules are designed generically, and the business-logic layer that captures the alpha is designed to interface with these modules, making the entire program data-source independent. The consulting firm then builds two “plugs”: one to a historical-data and simulated-matching-engine replay mechanism, and the other to the live data sources. This approach minimizes the trader’s exposure to costly bugs, allowing the consulting firm to continually test the “real” algo against the replay data while leveraging pre-made modules for the functionality that carries the highest potential risk, such as order submission. Equally important, this approach allows the “algo” to generate both back-tested and forward-tested results and to test the initially developed list of indicators and signals to see which best describes the alpha. Once the algo is fully optimized (in the same amount of time it would normally take just to complete the research phase), it is ready for production. The end result is a fully functional piece of software that can serve both as a simulation engine and as a production trading engine, optimized to have the most accurate trading logic for the alpha the trader wants to capture. Even better, because the software is built modularly, with each module having a single responsibility, it can be improved or changed at any time by simply swapping one module or adding another. This lets an initial concept be developed profitably and cost-effectively, and lets the trader gradually reinvest profits into improving the indicators, signaling methodology and execution performance.
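The two-“plugs” idea can be sketched as follows (a simplified, hypothetical illustration; class and function names are invented, not any vendor’s actual API): the strategy code depends only on a generic feed interface, so the identical business logic runs against a historical replay during research and against a live gateway in production.

```python
from abc import ABC, abstractmethod

class MarketDataFeed(ABC):
    """Generic module interface: the business-logic layer sees only this,
    so the program is data-source independent."""

    @abstractmethod
    def ticks(self):
        """Yield (timestamp, price) tuples."""

class ReplayFeed(MarketDataFeed):
    """'Plug' #1: replays recorded historical data for back-testing."""

    def __init__(self, recorded):
        self.recorded = recorded

    def ticks(self):
        yield from self.recorded

class LiveFeed(MarketDataFeed):
    """'Plug' #2: would wrap a real vendor/exchange connection (stub here)."""

    def ticks(self):
        raise NotImplementedError("connect to a live gateway")

def run_strategy(feed: MarketDataFeed):
    """The same trading logic, whichever plug supplies the data.
    Toy rule: signal BUY on any uptick."""
    last = None
    signals = []
    for ts, price in feed.ticks():
        if last is not None and price > last:
            signals.append((ts, "BUY"))
        last = price
    return signals

# Research and production differ only in which plug is passed in:
replay = ReplayFeed([(1, 100.00), (2, 100.50), (3, 100.25)])
print(run_strategy(replay))  # [(2, 'BUY')]
```

Because the back-test exercises the very same `run_strategy` code path that production will use, the simulated results describe the software that actually trades, not a separate prototype.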
As a trader, your career depends on making informed risk-versus-reward choices. If you wish to stay competitive in today’s markets, getting on board with algo trading is not an option; it is a requirement. The right technology partner is critical to ensuring that your hard-earned dollars are invested in technology that is reusable, flexible and scalable, and that minimizes your time to market. In the long run, the missed profits from being out of the market, the losses from costly software “bugs” or gaps in the specifications, the cost of untested business logic, and the frustration and fear of failure far outweigh any savings gained from cutting time and cost by skimping on research and using a hard-coded style of programming instead of a modular approach. With hybrid consulting, each dollar invested in technology is a dollar invested wisely.
David Widerhorn is the Chief Executive Officer of David Widerhorn Consulting, LLC (DWC). DWC is a Chicago-based boutique technology consulting firm focused on addressing the complex and dynamic needs of financial services firms. With a client base spanning over twenty countries and ranging from individual traders to investment banks and regulators, DWC specializes in both the research and development of advanced algorithmic trading solutions as well as the deployment of enterprise software solutions focused on risk management and compliance. Prior to his involvement with DWC, Mr. Widerhorn worked as a hybrid software developer-trader and developed some of the first successful high frequency trading systems for futures. He holds two degrees from MIT, one in Electrical Engineering & Computer Science from the School of Engineering and one in Finance from the Sloan School of Management, graduating at the age of 19.
More information can be found at www.davidwiderhorn.com