Job Description:
Design, build, and maintain robust data pipelines that connect diverse data sources, including databases, APIs, cloud platforms, and third-party systems. You will be responsible for data extraction, transformation, and loading (ETL) processes, ensuring data accuracy, consistency, and reliability. Collaborate with cross-functional teams to architect and optimize data storage and retrieval systems. Implement data warehousing solutions that support efficient data querying and analysis while adhering to best practices in data modeling.
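To give candidates a concrete sense of the ETL work described above, here is a minimal sketch of an extract-transform-load pipeline in Python. It uses an in-memory SQLite database as a stand-in for a data warehouse; the table, field names, and cleaning rules are illustrative assumptions, not part of the actual role's stack.

```python
import sqlite3

def extract(records):
    """Extract: simulate pulling raw rows from an API or source system."""
    return list(records)

def transform(rows):
    """Transform: normalize emails and drop rows missing required fields."""
    cleaned = []
    for row in rows:
        if row.get("id") is None or row.get("email") is None:
            continue  # enforce data accuracy: skip incomplete records
        cleaned.append({"id": int(row["id"]), "email": row["email"].strip().lower()})
    return cleaned

def load(rows, conn):
    """Load: upsert cleaned rows into a warehouse table (SQLite stand-in)."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT)")
    conn.executemany("INSERT OR REPLACE INTO users (id, email) VALUES (:id, :email)", rows)
    conn.commit()

# Hypothetical raw input, as it might arrive from a third-party API.
raw = [
    {"id": 1, "email": " Alice@Example.com "},
    {"id": None, "email": "broken@example.com"},  # rejected by transform
    {"id": 2, "email": "bob@example.com"},
]
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]  # 2 valid rows loaded
```

In production, the extract step would typically read from Kafka topics or REST APIs, and the load step would target a warehouse such as the ones mentioned in the requirements; the three-stage structure stays the same.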
Requirements:
- Familiarity with data integration tools and cloud-based data platforms, such as Kafka and Spark - a must
- Proficiency in programming languages like Python for data manipulation and integration tasks - a must
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field
- Proven experience as a Data Engineer, with a strong focus on data integration and ETL processes
- Strong understanding of data modeling concepts and database technologies such as SQL, NoSQL, and Big Data solutions
- Knowledge of data governance, data security, and data privacy principles
Position Type:
Full-time
Job Code:
12486
Region:
Center - Tel Aviv, Petah Tikva, Ramat Gan and Givatayim, Bik'at Ono and Givat Shmuel, Holon and Bat Yam
Shfela - Rishon LeZion and Nes Ziona