DefCon 19 ReCap - Security When Nanoseconds Count

Posted August 15, 2011, 10:54 am by Steven Fox


High-speed trading - it's a reality often lost in discussions of current financial challenges.  Much of the trading activity that drives the global economy occurs at speeds beyond human capability.  Yes - computers conduct trades at nanosecond speeds based on mathematical algorithms.  These algorithms weigh information from global markets, futures trading, and financial indicators, among other factors, to make trading decisions.  Speed is money for these systems; any device that increases network latency - such as a firewall - means lost profits.

Security When Nanoseconds Count was a presentation at DefCon 19 in which James Arlen discussed security in an environment where firewalls, access control lists, and system hardening are impractical given their negative impact on transaction speed. The threat landscape shows that these systems require non-technical solutions based on risk management and IT governance.



Vendors are a threat because they don't communicate changes to the protocols used in transaction timing. If a vendor patch changes a Precision Time Protocol (PTP) implementation, for example, to compute microseconds differently from other packages, the impact could extend to downstream systems that reference that time standard.
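One way to catch that kind of silent divergence is to cross-check timestamps from independent time sources before trusting them. Here is a minimal sketch; the function name, tolerance, and values are illustrative assumptions, not from the talk:

```python
# Hypothetical sketch: cross-check microsecond timestamps from two
# independent time sources (e.g., two PTP-synchronized hosts) and flag
# divergence beyond a tolerance before using them for transaction timing.

TOLERANCE_US = 50  # maximum acceptable divergence, in microseconds (illustrative)

def timestamps_agree(ts_a_us: int, ts_b_us: int,
                     tolerance_us: int = TOLERANCE_US) -> bool:
    """Return True if two microsecond timestamps agree within tolerance."""
    return abs(ts_a_us - ts_b_us) <= tolerance_us

# A patched vendor library that computes microseconds differently would
# surface here as a persistent offset between the two sources.
assert timestamps_agree(1_000_000, 1_000_030)      # 30 us apart: acceptable
assert not timestamps_agree(1_000_000, 1_000_200)  # 200 us apart: flag it
```

A check like this adds only a comparison, so it avoids the latency penalty of heavier controls while still detecting a vendor-introduced timing shift.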


The development of high-speed trading algorithms is rarely mediated by a rigorous Software Development Lifecycle.  Poor coding and testing practices can result in application security vulnerabilities.  The threat is heightened when you consider that these applications are often developed by traders with programming knowledge rather than trained developers.  Additionally, the development lifecycle usually allows on-the-fly changes to production code, a practice that exposes systems to additional security risks.


The algorithms that drive these trades can be manipulated by trader/coders intent on harming other parties.  These individuals exploit the lack of routines that double-check the information used to make trading decisions.
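A double-check routine of the kind these systems lack can be as simple as a plausibility test on incoming data before an algorithm acts on it. The sketch below is an assumed illustration; the function name and thresholds are hypothetical:

```python
# Hypothetical sketch: validate a quote before a trading algorithm
# acts on it. Thresholds are illustrative, not from the presentation.

def quote_is_plausible(price: float, last_price: float,
                       max_move_pct: float = 5.0) -> bool:
    """Reject quotes that are non-positive or too far from the last trade."""
    if price <= 0:
        return False
    move_pct = abs(price - last_price) / last_price * 100
    return move_pct <= max_move_pct

assert quote_is_plausible(101.0, 100.0)      # 1% move: accept
assert not quote_is_plausible(120.0, 100.0)  # 20% move: reject for review
assert not quote_is_plausible(-5.0, 100.0)   # malformed data: reject
```

Checks like this won't stop a determined insider, but they narrow the window in which manipulated or corrupted inputs can drive automated decisions.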

Market Fluctuations

Prior to the extensive use of high-speed transaction systems, market "buffers" were used to lessen the impact of extreme market fluctuations.  The advent of these systems, however, resulted in the removal of these buffers.  Why?  The profit potential of these systems required that market information be provided in a continuous stream.  Unfortunately, events like the "Flash Crash" of May 6, 2010 show how automated systems can generate negative trading activity, intensifying the impact of a fluctuation.
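The "buffers" described above can be pictured as a simple circuit breaker that halts trading when prices move too far, too fast. This is a minimal sketch under assumed names and thresholds, not an actual exchange mechanism:

```python
# Hypothetical sketch of a market "buffer": a circuit breaker that
# halts trading when the price drops beyond a threshold relative to
# a reference price. All names and values are illustrative.

class CircuitBreaker:
    def __init__(self, reference_price: float, halt_drop_pct: float = 10.0):
        self.reference_price = reference_price
        self.halt_drop_pct = halt_drop_pct
        self.halted = False

    def on_price(self, price: float) -> bool:
        """Record a new price; return True if trading should halt."""
        drop_pct = (self.reference_price - price) / self.reference_price * 100
        if drop_pct >= self.halt_drop_pct:
            self.halted = True
        return self.halted

cb = CircuitBreaker(reference_price=100.0)
assert cb.on_price(95.0) is False  # 5% drop: keep trading
assert cb.on_price(89.0) is True   # 11% drop: halt
```

Removing this kind of brake keeps the data stream continuous, which is exactly the trade-off the talk highlights: latency is saved, but an extreme fluctuation can feed on itself unchecked.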

What you can do.

  • Challenge your vendors to go beyond a checkbox mindset when developing solutions.  You should also work with vendors that have a mature SDLC in place and a staff of experienced developers.
  • Work with the business stakeholders to understand the risks associated with these systems.  The risk model must then inform the organizational and IT governance controls used to mitigate business risks.  Risk modeling should also consider the needs of business partners.
Edited August 16, 2011 by Kim