Kenetic Trading is a proprietary trading firm that deploys medium- and high-frequency trading strategies in the global cryptocurrency markets. Based in Hong Kong, we are a 24/7 business, trading digital assets in 10+ locations around the globe, with a team of world-class talent spanning 5 countries.
We are one of the fastest-growing market makers in the industry. Our arbitrage engines generate tens of millions of orders per day, and our trading team combines a systematic approach with sophisticated pricing models, iterating regularly to ensure robust trading performance.
More information can be found on our company website: http://www.kenetic.capital
Summary of Position:
This is an opportunity to become both steward and champion of the firm's market and trading data archives, and of the systems that capture, transform and load trading data to and from those archives.
You’ll learn from our experienced trading team and help develop and support systems that execute millions of trades on ‘crypto’ exchanges across the globe.
Reporting to: Chief Technology Officer
Job responsibilities include:
- Assume end-to-end accountability for the warehoused data, helping modify trading systems to add telemetry and leading the design and deployment of the data pipeline infrastructure
- Create tools to automate the configuration, deployment and troubleshooting of the data pipeline
- Develop strategies to make our data pipeline efficient, timely and robust in a 24/7 trading environment
- Implement monitoring that measures the completeness and accuracy of captured data
- Manage the impact that changes to trading systems and upstream protocols have on the data pipeline
- Backfill and clean historical datasets
- Collaborate with traders and trading system developers to understand our data analysis requirements, and to continue to improve the quality of our stored data
- Develop tools, APIs and screens to provide easy access to the archived data
Requirements:
- A Python programmer, experienced with asyncio
- 3+ years' experience developing in C++ on Linux
- Experience developing real-time, large-scale data pipelines handling petabytes of data
- Experience with distributed, high performance SQL and NoSQL database systems
- You have excellent written and verbal English
- You are reliable, take pride in delivering robust software, and are willing to be on call to support the systems you develop
- A bachelor's degree (or above) in Computer Science, Software Engineering or a similar field, with excellent results
Desirable:
- Experience with AWS or other cloud computing infrastructure
- Linux system administration skills
- Protocol level network analysis experience
- Experience with Terraform
- Experience with Hive and Hadoop
- Experience with cryptocurrency exchanges and cryptocurrency trading