JOB DESCRIPTION:
- Build and develop data ingestion from various sources: RDBMS, REST APIs, Kafka, text files, and spreadsheets
- Design, develop, optimize, and maintain data architecture and pipelines
- Work with the Core Data Engineering / Data Warehousing team to utilize existing frameworks for the implementation of these data pipelines
- Drive the prioritization, strategy, and focus to solve user problems
- Maintain and optimize data pipelines
- Participate in code reviews and follow best practices for developing and documenting data pipelines
- Continuously learn and adapt to new technologies and methodologies within the Data Engineering landscape
REQUIREMENTS:
- At least 4 years of experience as a Data Engineer or Data Analyst
- Excellent command of programming languages, preferably in Python
- Experience in managing a serverless data warehouse like BigQuery or Redshift
- Familiarity with schedulers and data-integration tools like Airflow and Airbyte
- Experience working with GitHub and Docker
- Deep knowledge of SQL database design (MySQL, Redshift, PostgreSQL)
- Understand how to optimize data retrieval and how to develop dashboards, reports, and other visualizations for stakeholders
- Good communication skills to work across departments