JOB DESCRIPTIONS:
- Create and develop data ingestion from various sources: RDBMS, REST API, Kafka, text files, and spreadsheets
- Design, develop, optimize, and maintain data architecture and pipelines
- Work with the Core Data Engineering / Data Warehousing team to utilize existing frameworks for the implementation of these data pipelines
- Drive the prioritization, strategy, and focus to solve user problems
- Maintain and optimize data pipelines
- Participate in code reviews and follow best practices for the development and documentation of data pipelines
- Continuously learn and adapt to new technologies and methodologies within the Data Engineering landscape
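The ingestion work described above can be sketched in miniature. The snippet below is a simplified, self-contained illustration (not Lion Parcel's actual pipeline): it loads text-file/CSV rows into a SQL staging table, with SQLite standing in for the warehouse, and the `parcels_staging` table and its columns invented for the example.

```python
# Simplified sketch of CSV/text-file ingestion into a SQL table.
# SQLite stands in for the real warehouse (e.g. Redshift/BigQuery);
# table and column names here are illustrative placeholders.
import csv
import io
import sqlite3

def ingest_csv(conn: sqlite3.Connection, csv_text: str) -> int:
    """Load CSV rows into a staging table and return the row count."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    conn.execute(
        "CREATE TABLE IF NOT EXISTS parcels_staging (parcel_id TEXT, weight_kg REAL)"
    )
    # Named-parameter style lets us pass the DictReader rows directly.
    conn.executemany(
        "INSERT INTO parcels_staging (parcel_id, weight_kg) "
        "VALUES (:parcel_id, :weight_kg)",
        rows,
    )
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    sample = "parcel_id,weight_kg\nA1,2.5\nB2,1.0\n"
    print(ingest_csv(conn, sample))  # → 2
```

In a production pipeline this load step would typically run as one task in an Airflow DAG, with extraction from the RDBMS/REST API/Kafka sources as upstream tasks.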
REQUIREMENTS:
- At least 4 years of experience as a Data Engineer or Data Analyst
- Excellent command of programming languages, preferably in Python
- Experience managing a serverless data warehouse such as BigQuery or Redshift
- Familiarity with orchestration and data-integration tools such as Airflow and Airbyte
- Experience working with GitHub and Docker
- Deep knowledge of SQL database design (MySQL, Redshift, PostgreSQL)
- Understand how to optimize data retrieval and how to develop dashboards, reports, and other visualizations for stakeholders
- Good communication skills to work across departments
- WFO: Lion Parcel Head Office (Kedoya, West Jakarta)
- Python
- SQL Server
- Airflow
- Redshift
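As a rough illustration of the "optimize data retrieval" requirement above, the snippet below shows the effect of an index on a filtered query. SQLite stands in for the MySQL/PostgreSQL/Redshift engines named in the requirements, and the `parcels` table and `idx_parcels_status` index are invented for the example; exact plan wording varies by engine and version.

```python
# Illustration of query-plan improvement from an index.
# SQLite stands in for MySQL/PostgreSQL/Redshift; names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parcels (parcel_id TEXT, status TEXT)")

# Without an index, filtering on status forces a full-table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM parcels WHERE status = 'DELIVERED'"
).fetchone()[3]

# An index on the filter column lets the engine seek instead of scan.
conn.execute("CREATE INDEX idx_parcels_status ON parcels (status)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM parcels WHERE status = 'DELIVERED'"
).fetchone()[3]

print(plan_before)  # e.g. "SCAN parcels"
print(plan_after)   # e.g. "SEARCH parcels USING INDEX idx_parcels_status (status=?)"
```

The same idea carries over to warehouse engines, where it surfaces as sort/distribution keys (Redshift) or partitioning and clustering (BigQuery) rather than B-tree indexes.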
At Lion Parcel, we are collaborative, fast-paced, innovative, open, and progressive.
Benefits and perks of working with us include:
Compensation: Competitive salaries
Family benefits: Paid maternity / paternity leave
Lifestyle: Casual dress code, Company outings, Flexible hours, Free food, Hybrid
Progression: Professional development
Welfare: Employee discounts, Health insurance, Paid sick days