What Jobs are available for Employee Data in Hong Kong?
Showing 1206 Employee Data jobs in Hong Kong
Data Analysis Specialist
Posted today
Job Description
Job Responsibilities
- Assist in building data models and the company's data platform, optimizing analysis processes, and promoting data standardization.
- Perform data analysis and reporting to support product strategy and customer management.
- Responsible for the collection, analysis, and maintenance of product-related data.
Job Requirements
- 1-3 years of work experience; master's degree or above.
- Strong data analysis and processing skills, proficient in using common data analysis tools such as SQL and Python.
- Clear thinking, logical rigor, attention to detail, strong sense of responsibility, and strong data insight ability.
Manager, AI Model Development and Data Analysis
Posted today
Job Description
Responsibility:
- Develop, train and deploy artificial intelligence and machine learning models to solve complex business challenges;
- Responsible for various data preprocessing activities including collection, cleansing and validation of large datasets, and transforming the data into a format suitable for consumption by artificial intelligence and machine learning models;
- Conduct in-depth data analysis on large datasets, and apply statistical and machine learning algorithms for extracting meaning from data and identifying actionable insights;
- Continuously monitor model performance and iterate to improve performance and efficiency of models with a view to optimizing business function;
- Responsible for preparation and maintenance of documentations in relation to the model development throughout the model development lifecycle;
- Assist in building a compliance data warehouse and continuously expanding data scope according to the needs of AML monitoring, optimizing database performance and improving the data usage experience;
- Assist in management and maintenance of the compliance data warehouse, including formulating database management policy and procedure, developing and maintaining data dictionary, monitoring data quality and ensuring data security;
- Assist in data management within the department, including data accountability management, data governance, data quality and security, etc.;
- Collaborate closely with cross-functional teams and external parties including business stakeholders, inhouse IT and data scientist team and solution vendors;
- Research emerging artificial intelligence trends and models, and integrate the new techniques to enhance model capabilities
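The model-monitoring duty above (continuously tracking performance and iterating when it degrades) can be sketched as a simple rolling-window drift check. All names and thresholds here are illustrative, not part of the posting:

```python
def rolling_means(scores, window):
    """Mean of each consecutive window of model scores (e.g. weekly AUC)."""
    return [sum(scores[i:i + window]) / window
            for i in range(len(scores) - window + 1)]

def flag_degradation(scores, window=3, threshold=0.05):
    """Return indices of windows whose mean score falls more than
    `threshold` below the baseline (the first window's mean)."""
    means = rolling_means(scores, window)
    baseline = means[0]
    return [i for i, m in enumerate(means) if baseline - m > threshold]
```

A real deployment would compare against a held-out validation baseline and raise alerts through the bank's monitoring stack; this only illustrates the rolling comparison.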
Requirements:
- Bachelor degree or above in Computer Science, Data Science, Financial Mathematics, Statistics, Finance, Economics, or related disciplines
- At least 3-5 years of relevant experience in the implementation of artificial intelligence and machine learning models; experience in banking data management, analysis and application is an advantage (for Senior Manager, at least 8 years of relevant experience)
- Proficiency in programming languages, including Python and R, and machine learning libraries
- Strong understanding of machine learning algorithms, deep learning and statistical methods
- Strong problem-solving ability with passion for analytic excellence, able to manage complex scenario challenges simultaneously and work independently and under pressure
- Experience in financial service industry or financial crime compliance domain, or possession of relevant qualification in CAMS, ECF (AML/CFT) Core Level, FRM, CPA, ACCA being an advantage
- Good command of written and spoken English and Chinese
AML Manager, AI development and data analysis
Posted today
Job Description
Responsibility:
- Develop, train and deploy artificial intelligence and machine learning models to solve complex business challenges;
- Responsible for various data preprocessing activities including collection, cleansing and validation of large datasets, and transforming the data into a format suitable for consumption by artificial intelligence and machine learning models;
- Conduct in-depth data analysis on large datasets, and apply statistical and machine learning algorithms for extracting meaning from data and identifying actionable insights;
- Continuously monitor model performance and iterate to improve performance and efficiency of models with a view to optimizing business function;
- Responsible for preparation and maintenance of documentations in relation to the model development throughout the model development lifecycle;
- Assist in building a compliance data warehouse and continuously expanding data scope according to the needs of AML monitoring, optimizing database performance and improving the data usage experience;
- Assist in management and maintenance of the compliance data warehouse, including formulating database management policy and procedure, developing and maintaining data dictionary, monitoring data quality and ensuring data security;
- Assist in data management within the department, including data accountability management, data governance, data quality and security, etc.;
- Collaborate closely with cross-functional teams and external parties including business stakeholders, inhouse IT and data scientist team and solution vendors;
- Research emerging artificial intelligence trends and models, and integrate the new techniques to enhance model capabilities
Requirements:
- Bachelor degree or above in Computer Science, Data Science, Financial Mathematics, Statistics, Finance, Economics, or related disciplines
- At least 3-5 years of relevant experience in the implementation of artificial intelligence and machine learning models; experience in banking data management, analysis and application is an advantage (for Senior Manager, at least 8 years of relevant experience)
- Proficiency in programming languages, including Python and R, and machine learning libraries
- Strong understanding of machine learning algorithms, deep learning and statistical methods
- Strong problem-solving ability with passion for analytic excellence, able to manage complex scenario challenges simultaneously and work independently and under pressure
- Experience in financial service industry or financial crime compliance domain, or possession of relevant qualification in CAMS, ECF (AML/CFT) Core Level, FRM, CPA, ACCA being an advantage
- Good command of written and spoken English and Chinese
Data Engineer, Data
Posted today
Job Description
We offer work from home (Max. 2 days per week), 14-20 days' annual leave, double pay, discretionary bonus, overtime pay, medical/dental/life insurance, five-day work week.
As a Data Management Engineer, you will play a critical role in ensuring the integrity, security, and efficiency of our data platform. You will collaborate closely with cross-functional teams to implement governance frameworks, enforce data standards, and optimize resource usage. Your work will directly support the organization's data strategy and compliance posture.
Job Description:
Lead the design, implementation and deployment of a master data management architecture that encompasses all customer source systems to enable data sharing across different regions, business units and departments
Operationalize Enterprise Master Data Repository, to enforce centralized governance controls at a global level
Identify and build data quality rules, investigate and remediate data quality issues
Design and build data quality dashboards with Power BI
Evaluate, select and implement appropriate data management technologies to address data governance challenges
Manage vendors to complete data governance activities, from vendor selection, data discovery, Proof of Concept (PoC) development, implementation to global adoption
Design and implement data governance solutions that incorporate AI-driven data management techniques to improve data quality and enhance data governance processes
Monitor data platform resource utilization and performance metrics
Identify and recommend opportunities for cost optimization and operational efficiency
Lead analysis of the current data platforms (e.g., logs) to detect critical deficiencies and recommend solutions for improvement
Engage with key data stakeholders to outline data objectives and gather data requirements. Execute solutions encompassing ownership, accountability, streamlined processes, robust procedures, stringent data quality measures, security protocols, and other pertinent areas to drive successful implementation
Implement the Architecture Governance Standard, Platform Design Principles, Platform Security, and Data Compliance Standard
Implement the Data Classification Standard to enhance data management and security measures within the organization
Take charge of the Global Data Quality Forum and establish regional forums if required to foster collaboration and knowledge sharing on data quality practices
Conduct market research and collaborate with vendors to evaluate cutting-edge data management technologies, trends, and products. Select and deploy the most suitable solutions for Global Data and Analytics Governance initiatives, ensuring seamless scalability
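The data-quality duties listed above (building rules, investigating failures, feeding dashboards) can be illustrated with a minimal rule engine. The rule names and record fields below are hypothetical, not Yusen's actual standards:

```python
def run_quality_checks(records, rules):
    """Apply each named rule (a predicate) to every record and
    collect the indices of records that fail it."""
    failures = {name: [] for name in rules}
    for idx, record in enumerate(records):
        for name, passes in rules.items():
            if not passes(record):
                failures[name].append(idx)
    return failures

# Example rules: completeness and validity checks on customer records.
RULES = {
    "customer_id_present": lambda r: r.get("customer_id") is not None,
    "amount_positive": lambda r: r.get("amount", 0) > 0,
}
```

The failure counts per rule are the kind of metric a Power BI data-quality dashboard, as described above, would visualize.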
Requirements:
Bachelor's degree from a recognized university in Computer Science, Information Engineering, or related field
At least 6 years of experience in Data Engineering, IT, Data Governance, Data Management or related field
Knowledge of data management best practices and technologies
Knowledge of data governance, security and observability
Proven ability to identify innovation opportunities and deliver innovative data management solutions
Hands-on experience in SQL, Python, and Power BI
Experience in Azure Databricks Unity Catalog and DLT
Excellent analytical and problem-solving skills
Fluent in English speaking and writing
Willingness to travel, as needed
The requirements below are considered advantages, but not a must.
Knowledge of data related regulatory requirements and emerging trends and issues
Experience in programming languages including PySpark, R, Java, Scala
Experience in working with cross-functional teams in global settings
Interested parties please send full resume with employment history and expected salary to HRA Department, Yusen Logistics Global Management (Hong Kong) Limited by email.
Yusen Logistics Global Management (Hong Kong) Limited is an equal opportunity employer. All information collected will be used for recruitment purpose only.
About Yusen Logistics
Yusen Logistics is working to become the world's preferred supply chain logistics company. Our complete offer is designed to forge better connections between businesses, customers and communities – through innovative supply chain management, freight forwarding, warehousing and distribution services. As a company we're dedicated to a culture of continuous improvement, ensuring everyone who works with us is committed, connected and creative in making us the world's preferred choice.
Data Management Engineer, Data
Posted today
Job Description
5-day work, 14-20 days AL, Double Pay, Discretionary Bonus, Group medical insurance
Job Scope of Position: Improve data management across Yusen Logistics group data platforms.
This job is for candidates with an interest in innovative, up-to-date data engineering, and with a proven ability to identify innovation opportunities and deliver innovative data management solutions.
Job Description:
SSOT & Master Data Management Implementation:
- Lead the design, implementation and deployment of a Single Source of Truth (SSOT) architecture to enable data sharing across different regions, business units and departments
- Operationalize the SSOT, serving as an Enterprise Centralized Data Repository, to enforce essential governance controls at a global level
- Manage master data and implement Customer 360
Data Quality Management:
- Engage with key data stakeholders to outline data objectives and gather data requirements. Execute solutions encompassing ownership, accountability, streamlined processes, robust procedures, stringent data quality measures, security protocols, and other pertinent areas to drive successful implementation
- Identify and build data quality rules, investigate and remediate data quality issues
- Design and build data quality dashboards with Power BI
Data Projects:
- Collaborate with data analysts, data stewards and business stakeholders to understand their data requirements and translate them into technical specifications
- Manage vendors to complete data governance activities, from vendor selection, data discovery, Proof of Concept (PoC) development, implementation to global adoption
- Conduct market research and collaborate with vendors to evaluate cutting-edge data management technologies, trends, and products. Select and deploy the most suitable solutions for Global Data and Analytics Governance initiatives, ensuring seamless scalability
- Data Forum: Organize Global Data Quality Forum and Global Data Architecture Review Forum
Requirements:
- Bachelor's degree from a recognized university in Computer Science, Information Engineering, or related field
- At least 3 years of experience in Data Engineering, or related field
- Knowledge of data management best practices and technologies
- Proven ability to identify innovation opportunities and deliver innovative data management solutions
- Hands-on experience in SQL and Power BI
- Fluent in English speaking and writing
- Excellent analytical and problem-solving skills
- Energetic, proactive, adaptive problem solver; organized, flexible and comfortable navigating ambiguity
- Occasional business travel is required
Senior / Data Analyst (Data Management - External Data)
Posted today
Job Description
Responsibilities:
- Handle, integrate and govern external data procurement requests
- Lead and support external data management activities in the external data acquisition process.
- Support vendor management (service quality and performance monitoring)
- Document external data sources, usage and other information to support management reporting
- Support other data management tasks as assigned
Requirements:
- Degree or above in finance, computer science, IT management, statistics, business administration, project management or relevant disciplines
- Minimum 3 years' working experience, preferably in data management particularly in external data
- Experience in external data procurement would be desirable
- Excellent communication skills and a strong team spirit
- A can-do attitude and attention to detail
- Familiar with Microsoft Office (including Chinese typing)
Candidates with less experience will be considered for the Associate Data Analyst position.
Senior Data Analyst/ Data Analyst
Posted today
Job Description
The Senior Data Analyst/ Data Analyst plays a key role in driving data-driven decision-making and continuous improvement across supply chain and logistics operations. This role combines technical expertise in analytics, system optimization, and process improvement with business acumen and project leadership to enhance supply chain visibility, performance, and automation.
What's in it for you:
1. Supply Chain Data Strategy & Architecture
- Design and maintain scalable data architecture for sourcing and logistics systems.
- Ensure integrity, connectivity, and automation of data flows between SAP, OTM/TMS, and BI (Qlik-Sense) tools.
- Develop data quality governance and validation frameworks.
2. Advanced Analytics & Insight Generation
- Lead development of predictive and diagnostic analyses to identify process inefficiencies.
- Translate complex datasets into actionable insights and recommendations for management.
- Develop and maintain performance dashboards and KPIs (Qlik Sense, Power BI, etc.).
3. Digital Tool Development & Automation
- Build and enhance analytical tools using Python, SQL, and other technologies to support decision-making.
- Develop RPA or automation schedulers to streamline manual tasks.
- Oversee integration of internal and external systems (e.g. Forwarder, Wakeo, Oracle OTM).
4. Process Optimization & Business Partnership
- Partner with Sourcing Supply Chain and Transportation Operations teams to identify improvement areas.
- Lead data-driven continuous improvement initiatives and contribute to OTM/TMS system enhancements.
- Act as key liaison between business teams and IT for technical solution deployment.
5. Project Management & Leadership
- Lead cross-functional data and system improvement projects from scoping to deployment.
- Manage project timelines, deliverables, and stakeholder alignment to ensure successful implementation.
- Support management in defining digital transformation roadmaps and long-term data strategy.
- Mentor and coach junior analysts to build a data-driven and continuous improvement culture.
Who we are looking for:
- Degree in Data Science, Computer Science, Supply Chain, or related discipline.
- 3+ years' experience in data analytics, process improvement or logistics systems.
- Strong command of Python, SQL and BI tools (Qlik Sense / Power BI).
- Advanced Excel, VBA, or automation scripting (RPA, Power Automate, etc.).
- Knowledge of SAP is an advantage.
- Solid business understanding of sourcing, transportation, and warehouse operations.
- Excellent communication, project management and stakeholder alignment skills.
- Fluent in English and Chinese.
We offer attractive remuneration package to the right candidate including 5-day work, double pay, discretionary bonus, family leave & birthday leave, medical, dental, life insurance scheme, etc. Interested parties please send your full resume with detailed working experience, availability, current and expected salary to Human Resources Department by clicking the apply button.
Company Overview
Groupe SEB, the world leader in small household equipment, operates in almost 150 countries with over 8 billion euros in sales, and has earned a strong position on all continents through diversified product ranges and a multi-brand strategy consisting of world-famous brands (Tefal, Lagostina, WMF, Krups, Rowenta, Moulinex, etc.) and local brands. Multi-cultural and multi-creative, Groupe SEB's 34,000 employees in 60 countries share the same values and commitment to sustainable development, and the same sense of professionalism and passion for innovation.
SEB Asia, our Hong Kong Office comprises a regional team for Asia Pacific, strategic marketing for global categories, commercial team for Hong Kong and Southeast Asia markets and also our Group Asia Sourcing platform to develop new products offers for our global markets.
To fuel its sustainable growth and long-term strategy, we are inviting talented and energetic individual to join our professional team.
Data Engineer/ Senior Data Engineer
Posted today
Job Description
Our client is now looking for talents to join their team:
Goal of the Project
- Implement a scalable cloud-based data platform to enable sustainable, reusable, secure and efficient ingestion, transformation, and storage of enterprise data across multiple sources.
- Improve data consistency, reliability and controlled accessibility by automating ETL/ELT workflows, enforcing data quality checks, and reducing manual intervention by 30–40%.
- Enable near real-time analytics capabilities to support business intelligence, predictive modeling, and faster decision-making for strategic initiatives.
Primary Responsibilities:
- Optimize, operate at scale, and enhance the cloud-based data platform on Microsoft Azure (including Databricks)
- Participate in the cloud data platform enhancement project lifecycle with an external vendor, including design, development, testing, deployment, and documentation.
- Work with Data Analysts to perform data preparation and cleansing, and build optimized data models and visualizations in Power BI
- Perform proofs of concept for cloud data products
Secondary Responsibilities:
- Validate data workflows and integration points through rigorous testing (e.g., pipeline validation, schema checks, and performance benchmarking) to ensure solutions meet business and technical requirements.
- Create operational playbooks and automation scripts to streamline deployment, monitoring, and troubleshooting, enabling efficient handover and long-term maintainability of data solutions.
- Document data pipeline architecture, deployment processes, and operational runbooks to support effective troubleshooting and post-go-live maintenance by internal teams
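The schema checks mentioned above can be sketched as a small validation step run on each row before loading. The expected schema here is hypothetical, not part of the posting:

```python
EXPECTED_SCHEMA = {"order_id": int, "amount": float, "region": str}

def validate_schema(row, schema=EXPECTED_SCHEMA):
    """Return a list of schema violations for one row: missing
    columns and columns whose value has the wrong type."""
    errors = []
    for column, expected_type in schema.items():
        if column not in row:
            errors.append(f"missing column: {column}")
        elif not isinstance(row[column], expected_type):
            errors.append(
                f"{column}: expected {expected_type.__name__}, "
                f"got {type(row[column]).__name__}"
            )
    return errors
```

In an Azure Databricks pipeline the same idea is usually expressed declaratively (e.g. table constraints or expectations) rather than row by row; this sketch only shows the check itself.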
Additional Beneficial Data Knowledge:
- ERP
- CRM
- Marketing
- Asset Management
- Construction
Requirements:
- Bachelor's degree in Computer Science, Information Technology, Data Science or a related field. Related certification in a cloud data platform.
- Fluency in English.
- Years of professional experience related to data platforms, especially on the data engineering side. Experience in Python, SQL, Spark, Azure Data Factory, Event Hub / Kafka / Event Messaging, Azure Data Lake Storage Gen2, Databricks, Unity Catalog, Power BI
- Experience in the software development lifecycle (SDLC).
- Understanding of data lifecycle and governance principles (e.g., data quality, lineage, security, and compliance).
Advantageous experience (not a must-have):
- Databricks Certified Data Engineer Associate or Professional
- Prior work on scalable data platform implementations or data platform migrations (or 2-3 project-specific examples).
- Exposure to data monetization or advanced analytics use cases (predictive modeling, AI/ML pipelines).
- Knowledge of Data & Analytics multi-tenancy, Data Lakehouse platform architecture, DataOps, MLOps, AIOps, ModelOps and FinOps.
- Ability to contribute as a strong individual contributor and to manage junior data engineers.
- Ability to manage engineering tasks and contribute to planning for successful execution.
Manager, Data Engineer, Data Transformation
Posted today
Job Description
Prudential's purpose is to be partners for every life and protectors for every future. Our purpose informs everything we do, creating a culture in which diversity is celebrated and inclusion assured, for our people, customers, and partners. We provide a platform for our people to do their best work and make an impact on the business, and we support our people's career ambitions. We pledge to make Prudential a place where you can Connect, Grow, and Succeed.
Job Responsibilities
- Design, build, and maintain scalable and efficient ETL/ELT pipelines in Azure Databricks to process structured, semi-structured, and unstructured insurance data from multiple internal and external sources.
- Collaborate with data architects, modellers, analysts, and business stakeholders to gather data requirements and deliver fit-for-purpose data assets that support analytics, regulatory, and operational needs.
- Develop, test, and optimize data transformation routines, batch and streaming solutions (leveraging tools such as Azure Data Factory, Data Lake Storage Gen2, Azure Event Hubs, and Kafka) to ensure timely and accurate data delivery.
- Implement rigorous data quality, validation, and cleansing procedures—with a focus on enhancing reliability for high-stakes insurance use cases, reporting, and regulatory outputs.
- Integrate Informatica tools to facilitate data governance, including the capture of data lineage, metadata, and data cataloguing as required by regulatory and business frameworks.
- Ensure robust data security by following best practices for RBAC, managed identities, encryption, and compliance with Hong Kong's PDPO, GDPR, and other relevant regulatory requirements.
- Automate and maintain deployment pipelines using GitHub Actions to ensure efficient, repeatable, and auditable data workflows and code releases.
- Conduct root cause analysis, troubleshoot pipeline failures, and proactively identify and resolve data quality or performance issues.
- Produce and maintain comprehensive technical documentation for pipelines, transformation rules, and operational procedures to ensure transparency, reuse, and compliance.
- Apply subject matter expertise in Hong Kong Life and General Insurance to ensure that development captures local business needs and industry-specific standards.
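The data cleansing and validation responsibilities above can be illustrated with a minimal pass over policy records. The field names (`policy_no`, `holder`) are hypothetical, not Prudential's actual data model:

```python
def cleanse_policies(rows):
    """Minimal cleansing pass: trim whitespace in string fields, drop
    rows with no policy number, and de-duplicate on policy_no,
    keeping the first occurrence."""
    seen = set()
    cleaned = []
    for row in rows:
        policy_no = (row.get("policy_no") or "").strip()
        if not policy_no or policy_no in seen:
            continue  # reject blank or duplicate policy numbers
        seen.add(policy_no)
        cleaned.append({
            key: value.strip() if isinstance(value, str) else value
            for key, value in {**row, "policy_no": policy_no}.items()
        })
    return cleaned
```

In a real Databricks pipeline this logic would live in a PySpark transformation, with rejected rows routed to a quarantine table for the root-cause analysis the posting describes.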
Job Requirements
- Bachelor's degree in Information Technology, Computer Science, Data Engineering, or a related discipline.
- 6+ years of experience as a data engineer, building and maintaining ETL/ELT processes and data pipelines on Azure Databricks (using PySpark or Scala), with a focus on structured, semi-structured, and unstructured insurance data.
- Strong experience orchestrating data ingestion, transformation, and loading workflows using Azure Data Factory and Azure Data Lake Storage Gen2.
- Advanced proficiency in Python and Spark for data engineering, data cleaning, transformation, and feature engineering in Databricks for analytics and machine learning.
- Experience integrating batch and streaming data sources via Kafka or Azure Event Hubs for real-time or near-real-time insurance applications.
- Hands-on use of Informatica for data quality, lineage, and governance to support business and regulatory standards in insurance.
- Familiarity with automation and CI/CD of Databricks workflows using GitHub Actions.
- Understanding of data security, RBAC, Key Vault, encryption, and best practices for compliance in the insurance sector.
- Experience optimizing data pipelines to support ML workflows and BI/reporting tools.
- Excellent command of English and Chinese – both written and spoken.
Prudential is an equal opportunity employer. We provide equality of opportunity and benefits for all who apply and who perform work for our organisation, irrespective of sex, race, age, ethnic origin, educational, social and cultural background, marital status, pregnancy and maternity, religion or belief, disability or part-time / fixed-term work, or any other status protected by applicable law. We encourage the same standards from our recruitment and third-party suppliers, taking into account the context of grade, job and location. We also allow for reasonable adjustments to support people with individual physical or mental health requirements.