The client is a quantitative trading firm with artificial intelligence as its core technology. Its team combines strong scientific research capability with extensive engineering experience, including prior work at companies such as Facebook and Apple. The firm deeply integrates and improves machine learning algorithms and applies them to financial data, which has a very low signal-to-noise ratio.
Within one year, it has rapidly grown to manage hundreds of millions in proprietary funds, and its trading volume ranks among the leaders in the Chinese market.
- Build and maintain a distributed data processing platform for massive financial data, achieving high-reliability, high-throughput batch processing of historical data
- Develop and maintain a real-time data processing platform for live trading, achieving high-reliability, low-latency real-time data processing
- Develop and optimise data pipelines for different research needs
- Participate in the design and implementation of data processing monitoring and management systems
- Familiar with the Linux operating system
- Familiar with one of the following: (a) C++14/17 plus a scripting language, or (b) Python 3 plus a strongly typed language
- Familiar with best practices such as git, code review, and CI/CD
- Have a strong interest in quantitative trading and machine learning