GCP Data Engineer
Location: Paris, France
Type: Contract
Responsibilities -
Design and implement data migration pipelines from on-premise systems (e.g., Oracle, SQL Server, Hadoop) to Google Cloud Platform (BigQuery, Cloud Storage, Cloud SQL).
Automate large-scale ETL/ELT workflows using Dataflow, Dataproc, and Cloud Composer (Airflow) to ensure seamless data ingestion and transformation.
Implement data partitioning, clustering, and optimization strategies in BigQuery to improve query performance and cost efficiency (see the sketch after this list).
Pre-process data and maintain the data pipelines that feed the predictive models.
Develop CI/CD pipelines for data workflows using Cloud Build, GitLab CI, or Terraform, ensuring automated deployment and reproducibility.
Support the existing team throughout the migration.
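For illustration only, the sketch below shows how one such load might look in Cloud Composer (Airflow): a single task that moves a daily extract from Cloud Storage into a date-partitioned, clustered BigQuery table. The bucket, dataset, table, and column names are hypothetical placeholders, not details of this role.

```python
# Minimal sketch (not this team's actual pipeline): a Cloud Composer / Airflow
# task loading a daily Cloud Storage extract into a date-partitioned, clustered
# BigQuery table. All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bigquery",
        bucket="example-landing-bucket",                  # placeholder bucket
        source_objects=["orders/{{ ds }}/*.parquet"],     # daily extract path
        source_format="PARQUET",
        destination_project_dataset_table="example_dataset.orders",
        write_disposition="WRITE_APPEND",
        # Partitioning and clustering reduce scanned bytes and query cost,
        # in line with the optimization responsibilities above.
        time_partitioning={"type": "DAY", "field": "order_date"},
        cluster_fields=["customer_id", "country"],
    )
```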
Requirements -
Minimum 3-5 years' experience as a Data Engineer
Experience with GCP (BigQuery, Dataproc, Workflows, Cloud Run)
Exposure to Terraform and GitHub Actions
Big data experience, mainly working with Spark
Additional knowledge and experience with Java, Hadoop, and MongoDB