Senior Officer, Finance System Management (40000346)
Job Purpose
Create data control checkpoints, analyse the data flow of systems that affect management reports, and leverage deployed systems (DWH, KRM, OFSAA, BI…) to operate multi-dimensional management reports.
Key Accountabilities (1)
- Operate the Data Lake system via Databricks to produce daily/monthly/periodic management reports
- Work with the BA team to develop technical documents, then build and automate management reports that meet the needs of business units and management departments of the bank, as well as ad-hoc reports per business requirements
- Set up ODI flows according to new requirements and build/update operational documents
- Work with the BA team to technically improve and upgrade data sets that have been, or are being, set up according to analysis/user needs
- Design and develop the structure of the management information system (within the permitted scope). Understand the data flow from source systems (transaction systems and other sources) into the data lake, and understand the data logic in order to set up internal data workflows and ensure data consistency and synchronization.
- Evaluate and identify process-improvement opportunities that increase efficiency and data/reporting quality from a technical perspective
- Participate in assessing and controlling the risks that data sources pose to reports, and analyse, from a technical perspective, the impact of changes in policies and requirements on data sources before reporting to Management
- Design, build, and maintain big data processing systems and ETL pipelines that run efficiently and stably.
- Collaborate with data analysis, data science, and software engineering teams to ensure that data is stored, processed, and retrieved optimally.
- Optimize data system performance and ensure security and compliance with data regulations.
- Participate in projects from time to time
Technical Skills
Cloud & Infrastructure: AWS (EKS, Lambda, SQS, Kinesis), Terraform
Data Platforms: Databricks, Apache Spark, Flink, Delta Lake, Iceberg
Databases: DynamoDB, Oracle, Oracle GoldenGate (OGG), ...
Governance & Cataloging: Unity Catalog, Collibra
Languages: Python, Java/Scala, SQL
Other Tools: GoAnywhere Managed File Transfer (MFT)
Success Profile - Qualifications and Experience
- University degree or higher in one of the following majors: IT, Mathematics, Computer Science, ...
- 1-3 years of experience working with data from database management systems such as DynamoDB, Oracle, Oracle GoldenGate (OGG), ... or big data processing tools such as Databricks, AWS Glue
- Understand the data lake concept and be able to work effectively with data from different sources.
- Experience working with data technologies such as Spark, Hadoop, Kafka, Airflow, or equivalent
- Preference is given to candidates with experience deploying on Cloud platforms (AWS, GCP, Azure).
- Eager to learn about data systems and to develop and improve their performance.
- TOEIC 450 or equivalent for Specialist; TOEIC 550 for CC Specialist
- Logical thinking; proactive, careful, and responsible
- Commitment to stable employment in the role for at least 2 years