Job Description:
- Design and implement scalable, high-performance data architectures that support the organization's current and growing data needs.
- Develop and maintain efficient Extract, Transform, Load (ETL) processes for ingesting data from various sources into the data warehouse.
- Manage and optimize databases, data lakes, and other storage solutions to ensure efficient data retrieval and storage.
- Implement data quality checks and ensure data integrity throughout the data lifecycle.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Proficiency in at least one programming language such as Python, Java, or Scala.
- Strong experience with data modeling, ETL development, and data warehousing.
- Familiarity with both SQL and NoSQL database systems, as well as big data technologies.
- Knowledge of cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving and communication skills.