Data Governance and Analytics Center, Head Office - Senior Specialist, Data Architecture and Data Platform
Orient Commercial Joint Stock Bank - OCB
The Hallmark Building, 15 Trần Bạch Đằng, Thủ Thiêm Ward, Thủ Đức City, Hồ Chí Minh City
Experience: 3 - 5 years
Job level: Experienced (Non-Manager)
Job Description
- Develop ETL/ELT jobs to ingest, transform, and load data from multiple source systems into Data Warehouse / Data Lake platforms based on approved analysis and design documents.
- Analyze and design data storage, data integration, and data processing mechanisms to ensure scalability, performance, and data quality.
- Write and maintain detailed technical specifications, deployment guides, and operations manuals.
- Prepare deployment packages and perform deployments across environments (DEV, UAT, PRODUCTION), including deployment validation and checklist verification.
- Optimize system performance through tuning, refactoring, and upgrading ETL/ELT jobs and data pipelines.
- Investigate, troubleshoot, and resolve data pipeline issues, incidents, and operational problems.
- Collaborate with related teams (Data Architecture, BI, Application, Infrastructure) to ensure smooth data integration and operation.
- Perform other tasks as assigned by management.
Job Requirements
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or equivalent practical experience.
- Solid background in data engineering best practices.
- Strong knowledge of Big Data concepts and architectures.
- Hands-on experience with Databricks (cloud-based) or Oracle (DWH) for data processing and analytics (mandatory).
- Experience in relational database development and optimization, including Oracle, SQL Server, MySQL, and DB2 (DB2 is highly preferred).
- Strong understanding of Data Warehouse, Data Modeling, Data Mart, Data Lake, and database design principles.
- Hands-on experience in building and maintaining ETL/ELT pipelines, especially using Oracle Data Integrator (ODI) or Airflow.
- Experience with Cloud platforms (AWS / Azure / GCP) and cloud-based data architectures.
- Experience with Agile Software Development, with a solid understanding of Agile principles, Scrum methodology, and collaborative delivery models.
- Strong analytical thinking, attention to detail, and problem-solving mindset.
- Team player with a proactive attitude and willingness to continuously learn and self-develop.
Nice to Have (Strong Plus):
- Experience or knowledge of IBM Banking Data Model.
- Understanding and hands-on exposure to DataOps practices, including CI/CD for data pipelines, monitoring, logging, and data quality automation.
More Information
- Degree: Bachelor
- Age: Unlimited
- Type of employment: Permanent