Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Set2Recruit
proficiency, Git proficiency, Linux use and admin, experience deploying cloud services (AWS is a bonus), experience with Docker and Kubernetes, using frameworks such as Airflow, ML background - PyTorch for computer vision. This is a fully remote role which comes with: budget for WFH set-up, stock options, 25 days …
to-end from scoping, designing, coding, release and continuous monitoring in a production environment •ELT pipeline: Experience with ELT pipelines and orchestration systems such as Airflow •Database systems: Experience working with one or more NoSQL databases such as Druid, Elasticsearch and Neo4j •AWS: Experience deploying and managing applications …
well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding of their ETL frameworks. Desired: experience with data pipeline and workflow management (Airflow, Composer). Desired: familiarity with machine learning algorithms and data science principles. Qualifications: Degree in Computer Science, Engineering, or a related field. …
within the EU Fusion programme and connections to international HPC communities, showcasing contributions made to the field. Experience in workflow management systems such as Apache Airflow. Familiarity with Research Data Management methodologies, modern database technologies including SQL, NoSQL and Graph Databases, and parallel file access technologies such as MPI …
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience managing end-to-end …
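Several of the listings above and below ask for experience scheduling dependent queries with an orchestrator such as Airflow. As a rough illustration of the core idea (run a task only after everything upstream of it has finished), here is a minimal pure-Python sketch; the task names and dependency graph are invented for the example, and a real Airflow DAG would declare the same structure with operators and `>>` dependencies.

```python
from graphlib import TopologicalSorter

# Hypothetical query tasks mapped to their upstream dependencies,
# mimicking what an Airflow DAG declares between operators.
dependencies = {
    "staging_orders": set(),          # no upstream: can run first
    "staging_customers": set(),
    "orders_enriched": {"staging_orders", "staging_customers"},
    "daily_report": {"orders_enriched"},
}

def run_in_order(deps):
    """Return task names in a valid execution order (upstream tasks first)."""
    return list(TopologicalSorter(deps).static_order())

if __name__ == "__main__":
    print(run_in_order(dependencies))
```

A real scheduler adds retries, backfills and a cron-style schedule on top of this ordering, but the dependency resolution is the part that tools like Airflow or Stitch replace hand-written cron chains with.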
working on very complex systems. Strong experience with computer vision. Longevity in their previous roles. Experience with remote sensing highly desirable. Stack: Python, PyTorch, Airflow, PySpark (equivalent tools are fine) …
effective data visualisation solutions. Key Skills: Strong Python and SQL coding experience essential. Experience with scripting languages. Proficiency in data orchestration tools (Dagster/Airflow). Proficiency in visualisation tools such as Tableau and Power BI. Proficiency with Git and CI/CD (ideally using Azure DevOps). Salary …
engineers to assist with technical work in agreed territories. Key skills/knowledge: Expertise in GCP. Experience in Python, SQL. Knowledge and expertise in Airflow, Dataform, Git. Previous experience working in a team in one of the following functions: data analytics, software or application development, database development. Previous experience …
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside of IR35. Hybrid working - 2/3 days in London office. Experience Required: - Commodities industry experience - Weather forecasting - Airflow (or equivalent data orchestration platforms) - Data Modeling (Star Schema) - Data Warehousing - Data Pipeline Orchestration (Kafka) - On-Prem SQL …
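The star-schema data modelling requirement above can be illustrated with a toy dimensional model: one fact table of trades referencing a commodity dimension, then an aggregate query joining the two. The table and column names are invented for the sketch, not taken from the listing, and sqlite3 stands in for the on-prem SQL warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: one row per commodity
    CREATE TABLE dim_commodity (
        commodity_id INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    -- Fact table: one row per trade, keyed to the dimension
    CREATE TABLE fact_trade (
        trade_id INTEGER PRIMARY KEY,
        commodity_id INTEGER REFERENCES dim_commodity(commodity_id),
        volume REAL NOT NULL
    );
""")
conn.executemany("INSERT INTO dim_commodity VALUES (?, ?)",
                 [(1, "gas"), (2, "power")])
conn.executemany("INSERT INTO fact_trade VALUES (?, ?, ?)",
                 [(10, 1, 5.0), (11, 1, 2.5), (12, 2, 7.0)])

def volume_by_commodity(conn):
    """Typical star-schema query: aggregate facts grouped by a dimension."""
    return conn.execute("""
        SELECT d.name, SUM(f.volume)
        FROM fact_trade f
        JOIN dim_commodity d USING (commodity_id)
        GROUP BY d.name
        ORDER BY d.name
    """).fetchall()
```

The pattern scales to many dimensions (date, counterparty, delivery point) radiating from the fact table, which is what the "star" refers to.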
and maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, Cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, PowerBI/Tableau Industry: Fintech, Maritime trading Immersum are supporting the growth of a specialist consultancy who solely specialise in the Maritime trading … developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or PowerBI Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
Experienced creating data pipelines in a cloud (preferably AWS) environment. CI/CD experience. Containerization experience (Docker, Kubernetes, etc.). Experience with SQS/SNS, Apache Kafka, RabbitMQ. Other interesting/bonus skills – Airflow, Trino, Apache Iceberg, Postgres, MongoDB. You *must* be eligible to work in your chosen …
Key technical experience: Ability to operate in a fast-changing environment. Fluent in English. Previous cloud-based infrastructure experience, particularly with AWS. Experience using Airflow and dbt. Expert SQL knowledge. Solid understanding of Dimensional Data Modelling. Experience with one or more of these programming languages: Python, Scala …/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with ambiguity and change. A self-starter who's able to work independently where necessary …
a production setting. Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge … data pipelines. Proficiency in Python programming (knowledge of Scala or Rust is a plus). Familiarity with ETL principles in contemporary data applications (Dagster, Airflow, Prefect). Familiarity with AWS services such as Glue, Redshift, Athena, and S3. Proficiency with Terraform, Kubernetes, and ArgoCD (expertise not required; cloud team …
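The batch-processing pattern this listing describes (ingest from mixed sources, clean, then load) can be sketched without Spark. This toy extract/transform/load uses plain Python with invented record fields; a PySpark version would express the same three steps as DataFrame reads, transformations and writes.

```python
def extract():
    # Stand-in for reading from streams, files or databases.
    return [
        {"user": "a", "amount": "10"},
        {"user": "b", "amount": "5"},
        {"user": "a", "amount": "7"},
        {"user": "c", "amount": None},   # dirty record to be filtered out
    ]

def transform(records):
    # Drop unusable rows, cast types, then aggregate per user.
    totals = {}
    for r in records:
        if r["amount"] is None:
            continue
        totals[r["user"]] = totals.get(r["user"], 0) + int(r["amount"])
    return totals

def load(totals, sink):
    # Stand-in for writing to a warehouse table.
    sink.update(totals)
    return sink

warehouse = load(transform(extract()), {})
```

Keeping extract, transform and load as separate pure functions is what makes each stage independently testable, which is the property the orchestration tools named here (Dagster, Airflow, Prefect) assume.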
utilising best-of-breed cloud services and technologies. So, what tools and technologies will you be using? AWS, Python, Databricks/Spark, Trino, Airflow, Docker, CloudFormation/Terraform, SQL/NoSQL. We provide you with the opportunity to think freely and work creatively and right now is a … Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi. Sound knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into tech specifications. Knowledge of architecture best practices and patterns. Competence in …
projects and new innovations to support company growth and profitability. Our Tech Stack: Python, Scala, Kotlin, Spark, Google PubSub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. Optimising data storage and retrieval … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
Python experience. Experienced with AWS and services like S3. Experienced with Kafka for data streaming. Familiarity with BI reporting tools. Good working experience with Airflow, PySpark and Apache Beam. Worked with Java for building data applications – advantageous. Worked within the commodities space – advantageous. Not quite right for you? …
schemas to support business requirements Develop and maintain data ingestion and processing systems using various tools and technologies, such as SQL, NoSQL, ETL, Luigi, Airflow, Argo, etc. Implement data storage solutions using different types of databases, such as relational, non-relational, or cloud-based. Working collaboratively with the client …/Azure SQL, PostgreSQL) You have framework experience within either Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving …
. Experience with SQL and query design on large, complex datasets. Experience with cloud and big-data tools and frameworks like Databricks/Spark, Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT … FiveTran, etc. Understanding of Agile Delivery best practice Good knowledge of the relevant technologies e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong problem-solving and analytical abilities. Ability to present solutions and limitations to non-IT business experts ABOUT YOU Integrity, respect, intellectually curious and an …
the above project of redesigning the Creditsafe platform into the cloud space. You will be expected to work with technologies such as Python, Linux, Airflow, AWS DynamoDB, S3, Glue, Athena, Redshift, Lambda, API Gateway, Terraform, CI/CD. KEY DUTIES AND RESPONSIBILITIES You will actively contribute to the codebase … and participate in peer reviews. Design and build a metadata-driven, event-based distributed data processing platform using technologies such as Python, Airflow, Redshift, DynamoDB, AWS Glue, S3. As an experienced Engineer, you will play a critical role in the design, development, and deployment of our business-critical system. You …
to some SAS development on legacy projects when required. Python or PySpark and SQL will be your bread and butter, with any experience of Airflow being a great bonus. The core skillset: Python/PySpark for building scalable & robust data pipelines Experience with Airflow or other orchestration tools …
Databricks • Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). • In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/… MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF etc. • Excellent consulting experience and ability to design and build solutions, actively contribute to RFP responses. • Ability to be a SPOC for all technical discussions across industry groups. • Excellent design experience, with entrepreneurship skills to own and lead solutions …
leading marketplace platform is looking for a Principal Data Engineer to take a lead on several aspects of their data platform built in Python, Airflow, Kafka, Sagemaker, AWS and GCP. 💰 Salary: up to £130k + 15% bonus 📍 Location: can be based anywhere in the UK ✅ Must have requirements: Mastery … of Python and SQL Significant experience with orchestration tools, ideally Airflow Strong Data Architecture experience Cloud expertise in either GCP or AWS (ideally in both) Knowledge of MPP systems such as Athena, BigQuery, EMR, Hive, Iceberg Exposure to data streaming technologies such as Kafka or Kinesis …
processes, effective monitoring, and infrastructure-as-code using Terraform. Collaborate closely with our engineering teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for …
financial services or energy trading industry. Expertise in Python and its ecosystem of libraries and frameworks for data processing, data analysis and data visualisation. Airflow: detailed understanding of architecture including schedulers, executors, operators. Cloud environments: understanding of principles, technologies and services for AWS/Azure. Kubernetes EKS/AKS … including high availability. Desired experience: Worked with Python 3.9+. Familiar with Python test automation. Experience with SQL and time-series databases. Familiar with Parquet, Arrow, Airflow, Databricks. Experience with AWS cloud services, such as S3, EC2, RDS etc. Quality engineering best practice and tooling including TDD, BDD. This is an …