DeepBlue Core

The DeepBlue Core is trafficnow's centralized software for interpreting wireless probe data and providing robust travel-time information for motorized vehicles under all traffic conditions. The system generates travel times, congestion alarms and traffic data. The Core assesses data quality in real time and adapts the data intervals to the statistical quality of the probes. The system offers a series of different algorithms and filters to adjust to all types of roads and infrastructure.

DeepBlue Core can be deployed in all environments, and can be tuned to meet the different challenges found in a large urban, interurban or full metropolitan area. Each individual vector within a project can be tuned to meet the conditions of the surrounding infrastructure. The tuning can involve configuring a multitude of filters, or using a different algorithm altogether.

In this way, several different situations can be addressed within one project, including:

  • Streets or roads with separate bus lanes
  • Streets or roads with exceptionally slow traffic
  • Streets or roads with intersections, where:
    • The speed is homogeneous
    • The speed is heterogeneous, because:
      • A set of probes drives faster to catch the green light, while another set drives slower and gets stuck at the red
      • The travel-time differences between right-turn, straight-ahead and left-turn movements are significant
  • Streets or roads with high density of pedestrians or bicycles
  • Ring-roads with adjacent streets or roads
  • Ring-roads or highways with large quantities of trucks
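The per-vector tuning behind these situations can be pictured as a small configuration table, where each vector selects an algorithm and a filter chain. The sketch below is a hypothetical Python representation; the keys `algorithm` and `filters`, and all names and values, are illustrative assumptions, not trafficnow's actual configuration schema:

```python
# Hypothetical per-vector tuning: each vector (road segment) selects an
# algorithm and a set of filters suited to its surrounding infrastructure.
vector_config = {
    "main-street-eastbound": {
        "algorithm": "intersection-split",   # heterogeneous speeds at signals
        "filters": ["bus-lane-exclusion", "turn-movement-split"],
    },
    "ring-road-north": {
        "algorithm": "free-flow",
        "filters": ["truck-profile", "adjacent-street-exclusion"],
    },
}

def filters_for(vector_id):
    """Look up the filter chain configured for a given vector."""
    return vector_config.get(vector_id, {}).get("filters", [])
```

A vector with no entry simply falls back to an empty filter chain, so new vectors can be added to a project without touching existing ones.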


To assure robust travel times 24 hours a day, 7 days a week, the system deploys dynamic intervals. This means that for each travel-time update the system automatically searches for a sufficient amount of data for the calculations. During heavy traffic the number of homogeneous samples is likely to be large; in that case the vector in question will use very recent data (for instance the last 3 minutes). During very light traffic, for instance in the middle of the night, probes are scarce, and the vector may require a much longer time interval to gather enough data for the calculations.
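The dynamic-interval search can be sketched as a loop that widens the lookback window until enough probe samples are found. This is a minimal illustration only: the function name, the doubling strategy and the thresholds are assumptions, not the actual DeepBlue Core algorithm.

```python
from datetime import datetime, timedelta

def dynamic_interval(sample_times, now, min_samples=20,
                     start=timedelta(minutes=3),
                     max_window=timedelta(hours=2)):
    """Widen the lookback window until it contains enough probe samples.

    sample_times: timestamps of received probe samples for one vector.
    Returns the chosen window and the samples that fall inside it.
    """
    window = start
    while window <= max_window:
        recent = [t for t in sample_times if now - t <= window]
        if len(recent) >= min_samples:
            return window, recent
        window *= 2  # illustrative doubling search, not the real strategy
    # Very light traffic: fall back to the largest allowed window.
    return max_window, [t for t in sample_times if now - t <= max_window]
```

With dense daytime traffic the initial 3-minute window already satisfies the sample threshold, while a sparse night-time stream forces the window to grow before a travel time is computed.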

A Key Performance Indicator in the DeepBlue Core centralized system is the Quality Factor (Q-Factor). The Q-Factor combines a series of measurements, such as the number of quality samples (probes), the number of non-quality samples (eliminated samples), the recentness of the probes, the time interval required to achieve a robust travel time, and the standard deviation (dispersion) of the samples.

The Q-Factor serves two purposes: it has a proactive role as well as a reactive role. The proactive role is to guide the algorithms towards the time interval required for achieving a quality result. The reactive role is to report the quality of the statistical information to the traffic management system. Based on the Q-Factor, the traffic management system can determine whether the data is trustworthy.
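How such a score might combine those measurements can be illustrated with a weighted sum over sample ratio, recency and dispersion. The weights, the 0 to 100 scale and the formula itself are illustrative assumptions; trafficnow's actual Q-Factor computation is not published here.

```python
import statistics

def q_factor(quality_tt, eliminated, window_minutes, target_minutes=3.0):
    """Illustrative 0-100 quality score (not trafficnow's actual formula).

    quality_tt:     travel-time samples (seconds) that passed filtering.
    eliminated:     count of non-quality samples that were discarded.
    window_minutes: time interval the vector needed to gather the samples.
    """
    n = len(quality_tt)
    # Share of received samples that survived filtering.
    sample_score = n / (n + eliminated) if (n + eliminated) else 0.0
    # 1.0 when the target interval sufficed; lower when the window had to grow.
    recency_score = min(target_minutes / window_minutes, 1.0)
    # Penalize dispersion via the coefficient of variation.
    mean = statistics.mean(quality_tt)
    cv = statistics.pstdev(quality_tt) / mean if mean else 1.0
    dispersion_score = max(0.0, 1.0 - cv)
    return round(100 * (0.4 * sample_score
                        + 0.3 * recency_score
                        + 0.3 * dispersion_score), 1)
```

Tight, recent, homogeneous samples score near 100, while stale or widely dispersed data drops the score, which is the signal a traffic management system would use to decide whether to trust the travel time.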

DeepBlue Core can also be used to collect RTMS data. The data is stored in the Core database, and a set of reports can be downloaded to Excel.


Server specifications
  • Dual processor
  • Min. 2.60 GHz octa-core
  • Min. 12 Gb/s SAS controller with RAID support
  • Gigabit Ethernet
  • Min. 32 GB RAM
  • Linux Operating System

System features
  • Web-based system
  • DeepBlue Sensor real-time operation
  • Travel-time
  • Congestion alarms
  • Data quality factor
  • RTMS data collection/reporting
  • Sensor status information
  • OSM GIS interface
  • Google Maps support (license not included)


Data interfaces
  • FTP-server for data reception
  • SSH-server for data reception
  • TCP-socket in server mode for data reception
  • TCP-socket in client mode for data reception
  • Data forwarding through XML
  • Web services data access (SOAP, REST)
  • Data download in XLS


Deployment options
  • Virtual Control Center
  • Locally installed server