What Jobs are available for Data Pipeline in Hong Kong?
Showing 12 Data Pipeline jobs in Hong Kong
Data Architect, Data Architecture and Platform Team
Posted today
Job Description
Responsibilities:
- Design, implement and manage the enterprise data architecture and roadmap, including architecture vision, principles, standards and policies, target state blueprints, and roadmap for the relevant data domain.
- Implement DataOps practices for CI/CD in data workflows.
- Work with vendors and the IT team on infrastructure alignment and security compliance, and resolve technical issues.
Requirements:
- Degree holder in Computer Science, Information Technology, or relevant disciplines.
- 5 years of IT experience, with at least 3 years in data engineering.
- Solid understanding of enterprise data architecture, with hands-on experience in building or maintaining data platforms and cloud resources for data project implementation.
- Strong knowledge of SQL, data modelling, and ETL/ELT design.
- Deep Azure cloud expertise (e.g., Databricks, Data Factory).
- Actively engage in and contribute to relevant solution design review activities and the data architecture review forum, and conduct project technical reviews and health checks.
- Promote ideas for using effective technologies based on analysis of technology industry and market trends.
- Strong problem-solving and analytical skills.
- Excellent communication, leadership, and interpersonal skills.
- Experience in designing and deploying data applications on cloud solutions, such as Azure.
- A data management and governance-related certificate is a plus.
Interested parties please send full resume with employment history and expected salary to HRA Division, Yusen Logistics Global Management (Hong Kong) Limited at Level 33, Tower 1, Kowloon Commerce Centre, 51 Kwai Cheong Road, Kwai Chung or by email to @
Yusen Logistics Global Management (Hong Kong) Limited is an equal opportunity employer. All information collected will be used for recruitment purposes only.
About Yusen Logistics
Yusen Logistics is working to become the world's preferred supply chain logistics company. Our complete offer is designed to forge better connections between businesses, customers and communities – through innovative supply chain management, freight forwarding, warehousing and distribution services. As a company we're dedicated to a culture of continuous improvement, ensuring everyone who works with us is committed, connected and creative in making us the world's preferred choice.
Data Integration Specialist
Posted today
Job Description
A leading global quantitative trading firm is seeking a Data Integration Specialist to join its data platform team.
Operating across all major asset classes worldwide, the firm applies a scientific, technology-first approach to systematic investing.
As part of this team, you'll play a critical role in building and maintaining the data infrastructure that powers trading and research.
What you'll do:
- Manage and optimise large, diverse datasets (structured and unstructured)
- Design and implement APIs and scalable data stores
- Monitor fetching processes and ensure data quality
- Integrate alternative and market datasets into research and trading workflows
- Support quants and traders with reliable, fast access to data
What we're looking for:
- 2+ years of experience in data engineering, integration, or a related field
- Strong Python skills, plus SQL and relational database knowledge
- Excellent communication skills in English
- Curiosity and drive to learn new technologies
- Ability to work independently within a global team
Desirable:
- Experience with financial/market data and vendor feeds
- Knowledge of equity or derivatives datasets
- Prior experience managing large-scale data pipelines
This is a chance to work at the core of a global trading operation, building the systems that enable some of the most advanced systematic strategies in the world.
Leading industry compensation & relocation on offer
Data Integration Architect
Posted today
Job Description
System Analyst, Data Integration, API design
Your New Company
Our client, a leading organisation in the life science and healthcare industry, is looking for a Data Integration Architect. Join an environment that values work-life balance while driving innovative data solutions in the healthcare sector.
Your New Role
- Developing and maintaining APIs and data transfer workflows for external system integration.
- Designing strategies for high availability, replication, and disaster recovery using modern infrastructure tools.
- Implementing data security measures including access control, encryption, and audit logging.
- Monitoring and optimising system performance to improve efficiency and responsiveness.
- Ensuring data quality through validation, lineage tracking, and consistency checks.
- Collaborating with cross-functional teams to integrate diverse data types and formats.
What You'll Need to Succeed
- Bachelor's degree in Computer Science or related field with 5+ years of experience.
- Expertise in RESTful API design with good security sense
- Strong experience with data transfer protocols (e.g. HTTPS, AWS S3)
- Proficiency in data modelling, database design (PostgreSQL, MySQL, MongoDB), and programming (Java or C++).
- Hands-on experience with Kubernetes
What You Need to Do Now
Click 'apply now' to forward an up-to-date copy of your CV to or call for a confidential discussion.
AI/Data Integration System Engineer
Posted today
Job Description
Client: Investment Management
Location: Hong Kong
Role: Permanent
We are seeking an AI/Data Integration System Engineer (Python) to manage our client's AI projects from start to finish, including quick prototypes and large-scale production.
This role combines hands-on AI development—creating tools that improve trading and risk management—with data engineering to ensure everything runs smoothly.
You will:
- Experiment with new AI methods, build efficient data pipelines, and integrate AI insights.
- Collaborate with innovative AI startups to identify valuable technologies and ensure our AI efforts are practical and forward-thinking
- Work closely with our Product Manager, engineers, and investment teams to prioritize projects and deliver effective solutions that meet our goals, while maintaining high standards of security and compliance in a regulated environment.
Backend Services & Pipelines
- Develop and maintain Python-based backend services and data processing pipelines.
AI Integration
- Build RAG pipelines for financial and unstructured data and integrate AI systems into workflows.
Infrastructure & Operations
- Monitor Kubernetes-based cloud infrastructure and optimize performance for stable deployments.
Looking for:
- Backend Engineering: Strong Python experience in building data pipelines.
- Database Knowledge: Familiarity with relational and distributed databases.
- Cloud Experience: Ability to manage services on cloud platforms (preferably AWS).
- RAG Workflows: Understanding of chunking, embedding, and vector search.
- Security Awareness: Knowledge of security frameworks and data protection laws (GDPR, HIPAA, CCPA).
We do not provide relocation assistance for candidates outside of Hong Kong. If you hold a Hong Kong work visa, dependent visa, IANG visa, or HK Top Talent visa, we encourage you to apply, and we will respond to your application.
Data Engineering Lead
Posted today
Job Description
Company Description
Oakham Partners is a bespoke recruitment services provider located in Hong Kong. We specialize in providing recruitment solutions for clients ranging from large corporations to SMEs and start-ups. Our expertise and industry knowledge allow us to find the best candidates for our clients' specific needs. We recruit specialists across various professional disciplines and cater to the local recruitment market by focusing on new and emerging areas. Our team of consultants are experts in their respective fields, with many having industry experience within the disciplines they recruit for. Whether it's permanent, contract, or interim professionals, we provide recruitment solutions worldwide.
Job Summary
We are seeking a highly experienced and strategic Data Engineering Lead to oversee the data engineering team and drive the implementation of advanced data and AI initiatives. The ideal candidate will have a strong background in data engineering, with significant exposure to data science and analytics, and a minimum of seven years of experience. This role requires a leader who can provide vision and technical guidance, foster a high-performing team culture, and effectively communicate complex data concepts to both technical and business stakeholders.
Key Responsibilities
Strategy & Leadership
- Provide overall strategic guidance for the data engineering roadmap, ensuring alignment with organizational objectives.
- Lead and mentor a data team, fostering a collaborative and high-performing culture to achieve strategic goals.
- Identify and present the business benefits and value propositions of potential data and AI use cases to stakeholders.
- Review and enhance the data team's operational model in conjunction with stakeholders to optimize efficiency and impact.
Technical Architecture & Development
- Build a scalable data framework using modern Azure technologies, including Azure Data Factory, Databricks, Azure Synapse, and Microsoft Fabric, to support downstream applications.
- Refactor redundant and low-efficiency code from legacy systems or vendors, including data pipelines, PySpark, SQL, and shell scripts.
- Oversee the implementation of AI solutions to provide self-service data analysis capabilities.
- Ensure the data architecture incorporates core data management competencies, including data governance, data quality, and data security.
Team Coordination & Communication
- Coordinate and oversee the offshore team in Mainland China, ensuring project goals and timelines are met.
- Facilitate effective communication and collaboration between onshore and offshore teams and with other departments.
- Act as a technical liaison, translating complex data concepts for business stakeholders and guiding the team to deliver user-centric solutions.
Qualifications
- Bachelor's degree in a STEM field (Science, Technology, Engineering, or Mathematics) or equivalent experience.
- A minimum of 7 years of experience in data engineering, with demonstrated exposure to data science and analytics principles.
- Proven experience providing strategic direction and implementing complex data and AI solutions.
- Hands-on expertise with Azure data services, including Azure Data Factory, Databricks (PySpark), Azure Synapse, and Microsoft Fabric.
- Experience with other relevant technologies, such as SQL Server Integration Services (SSIS), MongoDB, and modern orchestration tools.
- Strong analytical, problem-solving, and leadership skills, with the ability to manage complex projects and mentor junior team members.
- Exceptional communication and stakeholder management skills, with the ability to influence and build consensus across different groups.
Head of Data Engineering and Platform
Posted today
Job Description
Key Responsibilities:
- Lead and mentor a team of data engineers, fostering a culture of innovation and continuous improvement.
- Develop and execute a comprehensive data engineering strategy aligned with the company's goals.
- Collaborate with cross-functional teams to understand data needs and translate them into actionable plans.
- Oversee the design, development, and maintenance of scalable data pipelines and architecture.
- Ensure the implementation of robust data governance practices and data quality standards.
- Evaluate and integrate new technologies and tools to enhance data processing capabilities.
- Manage multiple data engineering projects, ensuring timely delivery and adherence to budgets.
- Develop project plans, timelines, and resource allocation strategies to meet business demands.
- Coordinate with stakeholders to prioritize data initiatives and address any challenges.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Communicate technical concepts to non-technical stakeholders and advocate for data-driven decision-making.
- Establish metrics and KPIs to measure the effectiveness of data engineering initiatives.
- Conduct regular assessments of data processes and workflows, identifying areas for optimization.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- 8+ years of experience in data engineering, with at least 3 years in a leadership role.
- Proven experience in building and managing data platforms in a cloud environment (e.g., AWS, Azure, GCP).
- Excellent leadership, communication, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work in a fast-paced, dynamic environment and adapt to changing priorities.
Head of Data Engineering and Platform (Major FSI) (Upto 170k)
Posted today
Job Description
Key Responsibilities:
- Lead and mentor a team of data engineers, fostering a culture of innovation and continuous improvement.
- Develop and execute a comprehensive data engineering strategy aligned with the company's goals.
- Collaborate with cross-functional teams to understand data needs and translate them into actionable plans.
- Oversee the design, development, and maintenance of scalable data pipelines and architecture.
- Ensure the implementation of robust data governance practices and data quality standards.
- Evaluate and integrate new technologies and tools to enhance data processing capabilities.
- Manage multiple data engineering projects, ensuring timely delivery and adherence to budgets.
- Develop project plans, timelines, and resource allocation strategies to meet business demands.
- Coordinate with stakeholders to prioritize data initiatives and address any challenges.
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Communicate technical concepts to non-technical stakeholders and advocate for data-driven decision-making.
- Establish metrics and KPIs to measure the effectiveness of data engineering initiatives.
- Conduct regular assessments of data processes and workflows, identifying areas for optimization.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related field.
- 8+ years of experience in data engineering, with at least 3 years in a leadership role.
- Proven experience in building and managing data platforms in a cloud environment (e.g., AWS, Azure, GCP).
- Excellent leadership, communication, and interpersonal skills.
- Strong analytical and problem-solving abilities.
- Ability to work in a fast-paced, dynamic environment and adapt to changing priorities.
Senior Data Management Professional - Data Engineering - APAC New Issues, Hong Kong
 
Posted 5 days ago
Job Description
Location: Hong Kong
Business Area: Data
Ref #
Description & Requirements
Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes.
Our Team:
Our Corporate Bonds Primary Markets team is responsible for the timely and accurate publication of new bonds being brought to market. Responsibilities also include developing automated pipelines for ingestion and publication of corporate bonds data, enhancing our primary markets dataset, and finding opportunities to expand our data product offering through client engagement. Our team works closely with internal consumers of our data including the News, Pricing, and Index teams. We also heavily collaborate with our counterparts in Product, Sales & Engineering.
What's the role?
We are looking for a Senior Data Engineer with an outstanding combination of technical, interpersonal and communication skills, which they can apply to spot gaps and find opportunities to improve our product's technologies. You will be seen as a technical leader across the organization. We are looking for a highly motivated individual who is passionate about building automated workflows and pipelines to support the transformation of the corporate bonds dataset. You will apply your problem-solving skills to handle the pipelines, automation processes and workflows that feed our products. In addition, you will be tasked with measuring and articulating the impact of your initiatives using business intelligence tools. You will be encouraged to lead projects globally, collaborate with partners across the business, and mentor junior members of the group who are looking to develop technically.
We'll Trust You To:
+ Understand financial markets and client needs to drive the corporate bonds data product strategy
+ Design, develop and maintain automated solutions via microservices for data ingestion, standardization, and ETL pipelining
+ Collaborate with a wide variety of external partners and internal departments including Engineering, Product and Sales on strategic product development and execution
+ Define, measure and manage the impact of your work using statistics and data visualization tools
+ Build quality data workflows to verify and validate third party data
+ Communicate with impact, ensuring relevant information is articulated in a meaningful way to wide and varied audiences
You'll Need to Have:
+ Excellent proficiency and fluency in English and Mandarin
+ Bachelor's degree or degree-equivalent qualifications
+ Demonstrated experience with corporate bonds and wider financial markets landscape
+ At least 4 years of experience working with data in the financial industry
+ 3 years of programming experience in a development and/or production environment using tools such as Python for data management and ETL pipelines development
+ Demonstrated experience using scripting languages to build pre-processing services that can be integrated into our data pipelines
+ Strong critical-thinking and problem-solving skills, particularly to modify and improve processes and workflows
+ Excellent written and verbal communication skills to explain technical processes and solutions to business partners and management
+ Ability to work independently as well as in a multi-functional team environment
+ Mindset to challenge status quo and proven ability to influence others and lead change
+ Understanding of data quality standard methodologies to improve the value of the dataset
+ Proven track record of effective project management and a customer focused mentality
+ Demonstrated continuous career growth within an organization
We'd Love to See:
+ Familiarity with fixed income markets
+ Familiarity with data science and machine learning techniques to help with automation
+ Agile/Scrum project management experience
+ A proven grasp of data management principles and technologies to perform requirements analysis, quality assurance and control, as well as data modelling
+ Experience using data analysis and visualization tools such as QlikSense, Superset or Tableau
If this sounds like you:
Apply if you think we're a good match.
Discover what makes Bloomberg unique - watch our for an inside look at our culture, values, and the people behind our success.
Bloomberg is an equal opportunity employer and we value diversity at our company. We do not discriminate on the basis of age, ancestry, color, gender identity or expression, genetic predisposition or carrier status, marital status, national or ethnic origin, race, religion or belief, sex, sexual orientation, sexual and other reproductive health decisions, parental or caring status, physical or mental disability, pregnancy or parental leave, protected veteran status, status as a victim of domestic violence, or any other classification protected by applicable law.
Bloomberg is a disability inclusive employer. Please let us know if you require any reasonable adjustments to be made for the recruitment process. If you would prefer to discuss this confidentially, please email
Senior Data Management Professional - Data Engineering - Fixed Income Corporate Actions, Hong Kong
 
Posted 5 days ago
Job Description
Location: Hong Kong
Business Area: Data
Ref #
Description & Requirements
Bloomberg runs on data. Our products are fueled by powerful information. We combine data and context to paint the whole picture for our clients, around the clock - from around the world. In Data, we are responsible for delivering this data, news and analytics through innovative technology - quickly and accurately. We apply problem-solving skills to identify innovative workflow efficiencies, and we implement technology solutions to enhance our systems, products and processes - all while providing platinum customer support to our clients.
Our Team:
Our Fixed Income Corporate Actions Data team looks after the lifecycle of corporate securities, focusing on transaction-level data and its impact on reference data and bond calculations. Corporate actions can have significant implications for bondholders and often affect the wider market, making timely and accurate interpretation and capture of corporate actions data key to meeting client needs. With the increased adoption of new technologies and industry standards, there is a growing need for automation and standardization of corporate actions, uncovering an opportunity for technical innovation for our team.
What's the role?
As a Senior Data Engineer on our team, you will play a critical role in designing, implementing, and maintaining robust data infrastructure to support Bloomberg's fixed income corporate action datasets. You'll be responsible for building scalable, automated data pipelines and workflows that ensure the seamless flow of data from diverse sources into our systems. You will collaborate closely with Engineering teams, domain experts, and business stakeholders to understand data requirements, identify gaps, and drive improvements across our technology stack. Your work will directly impact the quality and efficiency of our products, and you'll be expected to measure and articulate this impact using business intelligence tools. We're looking for a highly motivated technical leader who is passionate about solving complex data problems, transforming datasets, and mentoring junior team members. You'll be encouraged to lead global initiatives, foster cross-functional collaboration, and bring innovative, resourceful solutions to the table.
We'll Trust You To:
+ Design, develop, and maintain scalable data pipelines and processes that interact with our corporate action database
+ Apply modern technologies to solve problems and optimize workflows in areas such as data quality, acquisition, operations, reliability, and content, working closely with domain experts and technical account managers.
+ Collaborate with internal teams (Engineering, Product, Sales) and external partners to drive strategic product development and execution.
+ Use statistical analysis and data visualization to generate insights on operations and projects, and communicate findings effectively.
+ Incorporate machine learning and statistical techniques to detect anomalies and drive quality improvement in areas such as accuracy, completeness, consistency and reliability
+ Communicate with impact, ensuring sophisticated information is articulated in a meaningful way to wide and varied audiences including senior executives
+ Maintain comprehensive documentation outlining the purpose, design, and technical specifications of services developed by the team.
You'll Need to Have:
+ Strong written and verbal communication skills in both English and Chinese (Mandarin or Cantonese)
+ At least 4 years of experience working with data in the financial industry
+ 3 years of programming experience in a development and/or production environment, with proficiency using tools such as Python for data management and ETL pipeline development.
+ Experience using scripting languages to build preprocessing services integrated into data workflows.
+ Excellent critical thinking and problem-solving abilities, especially in improving and automating data processes.
+ Ability to work independently and thrive in a cross-functional team environment.
+ A proactive mindset with a willingness to challenge the status quo, influence others, and lead change.
+ Solid understanding of data quality best practices to enhance dataset value.
+ Proven experience in project management and a strong customer-focused approach.
+ A track record of continuous career growth and increasing responsibility within an organization.
We'd Love to See:
+ Familiarity with fixed income markets and financial datasets.
+ Experience with data science and machine learning techniques.
+ Background in Agile/Scrum project management methodologies.
+ A proven grasp of data management principles and technologies to perform requirements analysis, quality assurance and control, as well as data modelling
+ Experience using data analysis and visualization tools such as QlikSense, Superset or Tableau
If this sounds like you:
Apply if you think we're a good match.
Discover what makes Bloomberg unique - watch our for an inside look at our culture, values, and the people behind our success.
Bloomberg is an equal opportunity employer and we value diversity at our company. We do not discriminate on the basis of age, ancestry, color, gender identity or expression, genetic predisposition or carrier status, marital status, national or ethnic origin, race, religion or belief, sex, sexual orientation, sexual and other reproductive health decisions, parental or caring status, physical or mental disability, pregnancy or parental leave, protected veteran status, status as a victim of domestic violence, or any other classification protected by applicable law.
Bloomberg is a disability inclusive employer. Please let us know if you require any reasonable adjustments to be made for the recruitment process. If you would prefer to discuss this confidentially, please email