DevOps Engineer
- Permanent
- Location: Hybrid/Remote, with work from London once every two weeks.
Role Overview
We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record of building data analytics platforms or applications from scratch. In this role, you will design, implement, and manage the infrastructure and deployment processes required for a robust and scalable data analytics platform.
Experience
- Proven experience as a DevOps Engineer.
- Extensive knowledge and hands-on experience with GCP services, including BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, and Cloud Composer.
- Strong programming and scripting skills in languages such as Python, Bash, or Go, with the ability to automate tasks and build tools.
- Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent.
- Experience with real-time and batch data processing frameworks like Apache Kafka, Apache Spark, or Google Cloud Dataflow.
- Proficiency in CI/CD tools such as Jenkins, GitLab CI/CD, or Cloud Build, and version control systems like Git.
- Solid understanding of data privacy regulations and experience implementing appropriate security measures.
- Knowledge of infrastructure as code tools, such as Terraform or Deployment Manager.
- Excellent problem-solving and analytical skills, with the ability to architect and troubleshoot complex systems.
- Strong communication skills.
Responsibilities
- Lead the design and implementation of a scalable and reliable infrastructure on GCP for the data analytics platform, leveraging appropriate services and tools to ensure high performance, availability, and security.
- Collaborate closely with full-stack developers, data engineers, data scientists, and other stakeholders to define and implement efficient data ingestion, processing, and storage mechanisms.
- Implement and automate deployment processes using CI/CD pipelines and configuration management tools to enable rapid and reliable software releases.
- Establish processes for release management, testing, and automation.
- Implement and manage real-time and batch data processing frameworks, such as Apache Kafka, Apache Spark, or Google Cloud Dataproc.
- Build and maintain robust monitoring, logging, and alerting systems to ensure the health and performance of the platform.
- Ensure compliance with data privacy regulations and implement appropriate access controls and data encryption measures.
- Optimise the platform for cost efficiency by continuously monitoring and fine-tuning resource allocation and recommending the most appropriate GCP services.
- Troubleshoot and resolve complex technical issues related to infrastructure, data pipelines, and application performance.
- Stay up to date with industry trends and best practices in DevOps, data engineering, and cloud technologies, particularly related to GCP.
Why BI:PROCSI?
We started this company with one goal: to be the very best. Our team is our biggest asset; we don’t just believe it, we know it. We’re a group of passionate innovators (*nerds), obsessed with personal growth, who believe in challenging the status quo to arrive at the best solutions.
We have a phenomenal culture and unparalleled drive, and every single person in our team is very carefully selected to make sure we maintain it. We are diverse, and we celebrate that. We are whole people, with families, hobbies, and lives outside of work, and we make sure we keep a healthy work-life balance.
We are rapidly expanding and continuously hiring at all levels across Business Intelligence, Analytics, Data Warehousing, Data Science, and Data Engineering.
BI:PROCSI is actively committed to encouraging equality, diversity and inclusion throughout our workforce.
Our aim at BI:PROCSI is to represent all sections of society, and for each member of the team to feel respected in a balanced working environment where they can give their very best.