Average salary: 25 500 zł / month
- ...for analytics and reporting. You will develop and optimize ETL/ELT pipelines using Databricks (Apache Spark, SQL, Python) and Delta Lake technologies. You will be responsible for data modelling, data structures and performance optimization in analytical data stores...
  Suggested · Hybrid work · Remote work · Flexible hours · 15 120 - 21 000 zł
- ...Spark/PySpark, for simplified data access and governance. Secure integration solutions and enhanced data quality monitoring, utilizing Delta Live Tables tests, established trust in the platform. The intermediate result is a user-friendly, secure, and data-driven platform...
  Suggested · Remote work · Full-time · Flexible hours · 8 400 - 15 120 zł
- ...Spark/PySpark, for simplified data access and governance. Secure integration solutions and enhanced data quality monitoring, utilizing Delta Live Tables tests, established trust in the platform. The intermediate result is a user-friendly, secure, and data-driven platform...
  Suggested · Remote work · Full-time · Flexible hours · 110 - 165 zł / hour
- ...into actionable insights. Strong verbal and written English communication skills. Nice to have: Databricks, familiarity with Delta Lake / Medallion Architecture, monitoring/logging tools (Azure Monitor, Log Analytics). At Godel Technologies, we are passionate...
  Suggested · Full-time · 150 - 200 zł / hour
- ...Optimize Spark workloads by applying join strategies, shuffle optimization, caching, and partitioning techniques. Design and maintain Delta Lake architectures, including schema evolution, ACID transactions, and performance tuning. Implement data governance and access...
  Suggested · Full-time
- ...Build and optimize data transformations using PySpark and SQL in Databricks. Implement and maintain Lakehouse architectures using Delta Lake. Develop ETL/ELT pipelines orchestrated through Azure Data Factory. Integrate data from multiple sources into the data platform...
  Suggested · Internship · Full-time · Remote work · Evening work
- ...~3-5 years of experience building or optimizing cloud-based lakehouse platforms with Databricks. Practical experience with Delta Lake, Unity Catalog, Workflows, Databricks SQL, and Notebooks. Experience with Azure-based deployments of Databricks (other...
  Suggested · Full-time · Hybrid work · Remote work
- ...record of leading teams and delivering end-to-end data engineering projects. Deep expertise in Databricks, PySpark, SQL, and Delta Lake. Strong knowledge of cloud data ecosystems (Azure Data Lake, AWS S3, or GCP BigQuery). Experience with data modeling,...
  Suggested · Full-time · Hybrid work · Remote work
- ...Wroclaw. Migrate and optimize data using Databricks optimization techniques. Manage big data. Tune models for Azure with Java Spark and Delta tables. Address vulnerabilities through library version updates. 3+ years of proven experience with Python. Bachelor's or Master's...
  Suggested · Hybrid work · Flexible hours
- ...Python basics (working with data, notebooks, libraries like Pandas). Understanding of ETL/ELT concepts, data lake/lakehouse, Parquet/Delta formats. Ability to work with Databricks (notebooks, jobs) and/or willingness to learn through training platforms. Git basics (...
  Suggested · Internship · Full-time · Flexible hours
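Several listings above ask for Python/Pandas basics together with an understanding of ETL/ELT. A minimal sketch of the transform step with pandas, on made-up toy data (a real pipeline would extract from source systems and load into Parquet/Delta storage rather than in-memory frames):

```python
import pandas as pd

# Extract: raw feed as it often arrives — numbers as strings,
# inconsistent codes, missing values. Toy data, purely illustrative.
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": ["10.5", "20.0", None, "5.25"],
    "country": ["PL", "pl", "DE", "PL"],
})

# Transform: drop incomplete rows, fix types, normalise country codes.
clean = (raw
         .dropna(subset=["amount"])
         .assign(amount=lambda d: d["amount"].astype(float),
                 country=lambda d: d["country"].str.upper()))

# Load stand-in: aggregate per country; a real job would instead call
# clean.to_parquet(...) or write to a Delta table.
totals = clean.groupby("country")["amount"].sum().to_dict()
```

The same dropna/astype/normalise pattern is what "data cleaning" concretely means in most entry-level ETL work, regardless of whether the engine is Pandas or Spark.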
- ...Manage and process 12 TB of data efficiently across platforms. Tune machine learning models for Azure environments using Java Spark and Delta tables. Update and maintain libraries to address security vulnerabilities. Develop and maintain ETL/ELT pipelines using PySpark and...
  Suggested

