Job Highlights
Monthly Salary: 35,000
Job Responsibilities:
- Design, develop, and maintain offline and real-time big data platforms, building highly available data pipelines on technologies such as Spark, Hive, Flink, and Kafka; track new industry technology trends and drive upgrades and optimization of the data architecture.
- Perform layered data modeling (ODS/DWD/DWS/ADS) and subject-area division according to the characteristics of the securities business; develop ETL/ELT processes for the extraction, cleaning, transformation, and loading of multi-source heterogeneous data.
- Establish data quality monitoring, validation, and alerting mechanisms covering dimensions such as data completeness, consistency, and accuracy; manage metadata, map data lineage, and formulate data governance standards.
- Liaise with business departments to understand their needs; develop data reports, metric libraries, data interfaces, and data services to support business scenarios such as quantitative strategy backtesting and customer profiling.
- Optimize the performance of ETL tasks, data queries, and storage; resolve issues such as data delays and task failures; plan resource allocation sensibly to reduce operations and maintenance costs and improve resource utilization.
- Comply with securities-industry regulatory requirements and the company's data security policies; implement data classification, desensitization, access control, and operation auditing to safeguard core data.
- Write technical documentation such as data dictionaries, interface documents, development standards, and operation manuals; collaborate with data analysts, product managers, and business staff to deliver data requirements on schedule and to quality.
Entry Requirements:
- Master's degree or above, preferably in computer science, software engineering, statistics, or a related field; more than five years of work experience, including at least one year of data development in the finance/securities industry.
- Proficient in SQL and skilled in programming languages such as Python or Scala, with solid experience in big data technology stacks (Hive, Spark, Flink, Kafka, etc.).
- Familiar with layered data warehouse modeling theory; experience with data modeling in the securities industry (market data, trading, risk control, etc.) is preferred.
- Good problem-solving skills, with the ability to independently resolve technical issues in data development and platform operations and maintenance. A strong sense of compliance and data security awareness, with an understanding of financial-industry regulatory requirements.
- Good cross-departmental communication skills and team spirit; able to work well under pressure.
If you are interested in this job, please send your resume to enquiry@pinetone.com.hk. A shortlist of qualified applicants will be selected for interviews.
Pinestone Capital Limited 鼎石資本有限公司
(Incorporated in the Cayman Islands with limited liability)
(Stock Code: 804)
About us
We are a Hong Kong-based financial services provider offering bespoke services to individual and corporate clients. Our main business comprises (i) securities brokerage; (ii) securities-backed lending and (iii) placing and underwriting services.
Data Development Engineer