Data Calculation Engine
Our Analyser Processor
- Simple concepts for comparing time-series data
- Inbuilt aggregations to reduce data points
- Formula calculations to compare results from different time slices
- Almost completely no-code
- Range testing (predicates) - evaluate your calculation results as pass or fail
- Dynamic SQL processing - take any flat JSON data file and apply further set-based analysis to it
- Always have access to the data - all data is read from and written to JSON data files
- Embed (inject) new aggregations where required
- Rolling window analysis (time-series shifts)
Explaining the Analyser Processor (AP) to a non-technical audience
There are many ways to process and analyse data, and each approach has its own advantages and disadvantages. The biggest challenges are cost and time. We wrote the AP to require no programming skills - anybody who can write basic Excel formulas can use it.
Our work over the last few years at Info Rhino has focused on smaller utilities which sit between larger systems. These utilities do things those larger systems cannot, or could only do at great cost in time to integrate.
The Analyser Processor does just that. It is a tool designed to provide powerful data calculation and analytic capabilities, without pretending to be a database, a cube, a big data store, or a machine learning package.
Main Functions within the Analyser Processor
Calculation Mode
Define which calculations you want produced, over your desired time periods. The DCE will take any number of files and output the calculation results.
Formula Mode
Takes calculation results and runs formulas on them. Typically, we use it to compare aggregations from different time periods. These formula results are stored ready for further inspection.
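As a sketch of what these two steps amount to - the data, the `total_sales` aggregation, and the difference formula are our own illustrative choices, not the DCE's actual definitions:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: aggregate sales per time slice, then run a
# formula comparing the two slices. Names and data are illustrative only.
rows = [
    {"event_time": "2024-01-01T09:01:00", "sales": 120.0},
    {"event_time": "2024-01-01T09:04:00", "sales": 80.0},
    {"event_time": "2024-01-01T09:12:00", "sales": 150.0},
    {"event_time": "2024-01-01T09:18:00", "sales": 50.0},
]

def total_sales(rows, start, end):
    """Sum sales for rows whose event time falls in [start, end)."""
    return sum(
        r["sales"] for r in rows
        if start <= datetime.fromisoformat(r["event_time"]) < end
    )

t0 = datetime(2024, 1, 1, 9, 0)
# Two consecutive 10-minute time slices.
slice_a = total_sales(rows, t0, t0 + timedelta(minutes=10))
slice_b = total_sales(rows, t0 + timedelta(minutes=10), t0 + timedelta(minutes=20))
# A "formula" comparing the aggregations from the two periods.
change = slice_b - slice_a
```

In the DCE itself, both the aggregation and the comparison formula would be configured rather than coded.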
Range Test Mode
For any formula result, we can test whether the value meets one or more conditions to output a true or false result. This opens up massive potential to produce signals or markers.
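A range test of this kind can be sketched as a set of predicates applied to a formula result - the `range_test` helper and thresholds below are our own illustration, not the product's configuration:

```python
# Hypothetical range-test sketch: a formula result passes only if it
# satisfies every predicate (condition) defined for it.
def range_test(value, predicates):
    """Return True if the value meets all conditions, else False."""
    return all(p(value) for p in predicates)

# Example: flag a percentage change that sits between -5 and +5.
predicates = [lambda v: v >= -5.0, lambda v: v <= 5.0]
signal = range_test(3.2, predicates)  # 3.2 is inside the range
```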
Chart Processor Mode
A main way users understand data and recognise patterns is visually. Statistics, machine learning, artificial intelligence, and business intelligence outputs still end up being displayed visually to users. This mode puts the data into a chartable format.
Dynamic Data Mode
Any flat dataset can be loaded into our application, and you can write your own SQL to transform it. You could aggregate the data further, using advanced set-based analysis native to the database.
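The current product uses SQL Server for this mode; purely as an illustrative stand-in, the same idea - load a flat JSON dataset into a database and run set-based SQL over it - looks roughly like this with SQLite:

```python
import json
import sqlite3

# Illustrative stand-in only: the Analyser Processor uses SQL Server for
# Dynamic Data Mode; SQLite is used here just to show the shape of the idea.
flat_json = """[
    {"product": "A", "sales": 120.0},
    {"product": "A", "sales": 80.0},
    {"product": "B", "sales": 150.0}
]"""
rows = json.loads(flat_json)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, sales REAL)")
conn.executemany("INSERT INTO sales VALUES (:product, :sales)", rows)

# Further set-based aggregation over the loaded data.
totals = dict(conn.execute(
    "SELECT product, SUM(sales) FROM sales GROUP BY product"
).fetchall())
```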
If you are a software vendor in data - BI, ML, or AI - or you think this can help your delivery, please contact us.
Note: our current version uses SQL Server for Dynamic Data Mode; we are evaluating open-source databases for this task to help our clients reduce licensing costs.
Technical thoughts on the Data Calculation Engine
Data Architects - Derisk your data function by reducing End User Computing
- Let analysts define calculations within our DCE
- Provide data the tool can analyse
- Consume data and analyst definitions without needing to navigate complicated spreadsheets
- Store configuration in document stores
- Build your own application management interfaces or export definitions dynamically
- Plug our application into your batches
Scalability - how performant is the DCE?
We don't recommend datasets with more than 100k data points. Simply break them into smaller files if required. Scaling out rather than scaling up may be a better approach.
More savvy users will recognise that they may benefit from doing certain processing independently of the DCE. SQL Server and Oracle both have excellent support for working with data in JSON format.
Costs and licence
Our application can serve private individuals wishing to get on top of automating data processing, the small business wishing to reduce costs of development, and the enterprise wishing to add major analytical capability throughout their data infrastructure.
Please note, we are still considering the best mechanism to deploy this software. Please don't hesitate to contact us with your thoughts.
Less than 10 Employees
2 years in business
Avoid expensive server software until your business is more established
Between 11 and 50 Employees
Less than 2 years in business
Regular data crunching as part of your daily business to better understand your operations, keep your customers better informed, and track activity.
More than 75 Employees
You see this as a key and simple addition to your data function. This reduces development time when prototyping, reporting, analysing, and testing.
If you have multiple sites, please contact us to discuss.
Find out more
We are still considering the right way to deliver this product to individuals, as £3000 is too expensive for many. Please drop us an email with your potential use cases if you would like to know more.
How the Data Calculation Works
Our application has five core processing modes; you are free to decide which ones run and in what order. You may chain together different instances or set up all steps in the same application instance.
A calculation is typically an aggregation happening over a defined time period. For example, we may ask for the total sales in the last 10 minutes. We may also want to know the maximum price paid for a product in that period.
The application has a maximum of three time periods which sit within a window. For example, we could have 10 minutes, followed by 5 minutes, followed by 10 minutes. The window would be 25 minutes.
The engine moves through time series data, "bucketing" data into distinct datasets for local analysis.
Our application always works on the principle of Ranges, Event Time, and Values.
We will want to focus on the same event time, and a combination of ranges and values.
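The bucketing described above can be sketched as follows - the period lengths and helper names are ours, chosen to match the 10/5/10-minute example, and are not taken from the product:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: three consecutive time periods (10, 5, and 10
# minutes) forming one 25-minute window, with events bucketed by period.
period_minutes = [10, 5, 10]
window_start = datetime(2024, 1, 1, 9, 0)

buckets = []
cursor = window_start
for minutes in period_minutes:
    buckets.append((cursor, cursor + timedelta(minutes=minutes)))
    cursor += timedelta(minutes=minutes)

window_length = cursor - window_start  # the window spans all three periods

def bucket_of(event_time):
    """Return the index of the period an event time falls into, or None."""
    for i, (start, end) in enumerate(buckets):
        if start <= event_time < end:
            return i
    return None
```

Each bucket is then a distinct dataset over which local aggregations, formulas, and range tests can run.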