Job Description:
1. Design, develop, and maintain a robust and efficient data architecture
2. Develop and implement optimized database schemas while ensuring data integrity and quality
3. Build and manage data pipelines to move data efficiently from source to destination
4. Collaborate with data scientists, analysts, and development teams
Requirements:
• Bachelor's degree
• Minimum GPA of 3.00
• Skills: data modelling, ETL processes, big data technologies (Hadoop, Spark, NoSQL), programming in Python, data governance, and data security
• Experience: junior level (1-2 years of work experience)
• Work arrangement: hybrid