Job Description

  • Design, implement and maintain data architecture, databases and data connections (on-premises and cloud environments)
  • Develop, implement and maintain ETL (extract, transform and load) processes
  • Design conceptual and logical data models and flowcharts
  • Provide professional advice on data architecture and design
  • Migrate data from legacy systems to new solutions
  • Analyse structural requirements for new integrations
  • Define security and backup procedures
  • Develop, implement and maintain API connections to multiple data sources


Qualifications

  • Bachelor’s Degree in Computer Science, Information Technology, Computer Engineering or any related field
  • Experience with database programming (SQL) and Business Intelligence tools (Power BI, Tableau) is a must.
  • Experience working on cloud implementations and integrating cloud services to migrate legacy environments
  • Experience in big data projects using Spark, Python, Hive, PHP, Bash/shell scripting or Hadoop is a plus
  • Experience in API development
  • 1-2 years of experience preferred. Fresh graduates are welcome.