Master information
GCP Data Engineer
Position: Not specified
Start: As soon as possible
End: Not specified
Location: Toronto, Canada
Method of collaboration: Project only
Hourly rate: Not specified
Latest update: Jun 11, 2024
Task description and requirements
Minimum 8 years of work experience in development/migration projects, including 5+ years handling data engineering/data science projects and 3+ years of GCP experience.
Experience with GCP-based Airflow/Kubeflow development and deployments using Google Cloud Storage, Pub/Sub, Dataflow, and Dataproc.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a global team.
Passionate about data solutions.
Self-motivated and able to work in a fast-paced environment.
Detail-oriented and committed to delivering high-quality work.
Strong coding skills in Python and PySpark, plus hands-on experience with Airflow/Kubeflow.
Hands-on experience with GCP data services such as Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
Experience using Databricks (Data Engineering and Delta Lake components).
Experience with source control tools such as GitHub and the related development processes.
Experience with workflow scheduling tools such as Airflow.
Strong understanding of data structures and algorithms.
Experience building data lake solutions leveraging Google data products (e.g., Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep), Hive, and Spark.
Experience with relational (SQL) and NoSQL databases.