X2 Data Engineer

Are you a Data Engineer experienced with Databricks and familiar with AWS or GCP (preferably both)? My client is expanding their Delivery Centre.
Work will include everything from public-sector transformation to cutting-edge commercial enterprises.
This Data Engineer position will require you to work in the Data Operations area of a Delivery Centre, which provides services to both private and public sector clients.
The role can be performed remotely, with the expectation of one day per month in a local office to collaborate with your team; office access is also available if desired.
All successful applicants must have, or be eligible for, UK Security Clearance (SC), with the intention of moving to higher-level clearance if necessary.
Your future duties and responsibilities will be as follows:
You will join a team responsible for building and managing big data infrastructure, including data pipelines and platforms.
You will work on projects within your comfort zone, but you will also have access to stretch projects and training programmes that let your career progress in the direction you desire.
Qualifications required for success in this role:
We're looking for a combination of the following:

- Azure data stack
- AWS: DynamoDB, Lambda, S3, IAM, Secrets Manager, DMS, EC2, EMR, SQS, CloudWatch
- GCP: Composer, Cloud Functions, Cloud Logging
- Spark and Databricks (yes please!)
- Infrastructure as code: Terraform, CloudFormation, Pulumi, Ansible
- Python
- Git
- SQL Server / Postgres / Cassandra; SQL and schema-management experience
- Helm, Kubernetes
- Kafka
- Airflow
- Parquet / Snowflake BI
- Datadog
- Scala or Java
- API development / microservices
- Docker
- Jenkins and CI/CD practices

Skills:
Compliance management; data engineering

On offer:
Ref: #4195998. Salary up to £74k plus bonus.