Understand our customers' core business objectives and build end-to-end, data-centric solutions to address them.
Collaborate with technical business analysts to understand business requirements and data, and to optimize data pipeline performance.
Build solutions using big data and data management tools that meet project requirements.
Write code that follows sound coding standards and practices to ensure high quality and minimize risk.
Work closely with the project manager and technical leads to provide regular status reports and help them identify blockers for quick resolution.
Fix defects raised during testing and update mappings/code in response to change requests.
Bachelor's degree or associate/diploma degree in computer science, information systems, or a related field.
Knowledge of Scrum and Agile methodologies.
Experience with Snowflake or Databricks is a plus.
Experience with data engineering, big data technologies, data transformation, and data modeling, as well as visualization tools such as Tableau, Power BI, or matplotlib/seaborn. Experience architecting and building scalable data platforms.
Experience with Informatica or similar data integration tools.
Experience with cloud technologies (data lakes, Azure, Google Cloud, AWS, etc.) or open-source technologies (Spark, Hadoop, Flink, Kafka, Presto, Hive, Cassandra, etc.).
Experience with MySQL and/or NoSQL databases.
Proficiency in a programming language such as Python or Java for data analysis and machine learning model development.
Ability to manage and structure complex tasks.
Ability to solve complex, high-level problems and application issues.
Ability to work under pressure.
Ability to work within a team.
Good communication skills.