has multiple years of experience using Snowflake as a data tool and can hit the ground running. Experience: Snowflake; financial services; cloud (ideally Azure); Airflow would be an advantage more »
have Terraform experience. SQL & NoSQL experience. Have built out data warehouses & built data pipelines. Strong Databricks & Snowflake experience. Docker, ECS, Kubernetes & orchestration tools like Airflow or Step Functions are nice to have. Contracts are running for 6 months initially, paying up to £450 p/day (Outside IR35) and will more »
load data to Snowflake. Converting a SAS-based module to a Python-based one. Candidates should have data management experience in Snowflake with expertise in Python, DBT, Airflow or similar technologies. Must have hands-on experience in DBT & SQL. Snowflake (added advantage more »
Data Engineer. Experience working with AI frameworks and libraries (PyTorch). Confidence collaborating in complex, cross-functional teams. Strong skills with: AWS, Python, Airflow, Snowflake. Interested? Apply now or reach out to daisy@wearenumi.com for further details and a chat more »
hands-on and will require exposure to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working with a Linux environment. Scripting exposure in Python, Bash or shell scripting. more »
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond
successful Lead Data Engineer will have: Experience leading a Data Engineering team. Extensive working experience with GCP, SQL and DBT. Proficient in: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform and BigQuery. What's in it for the successful Lead Data Engineer: Hybrid working for a better work/ more »
Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, FiveTran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering more »
Greater London, England, United Kingdom Hybrid / WFH Options
Agora Talent
early-stage B2B SaaS experience involving client-facing projects • Experience in front-end development and competency in JavaScript • Knowledge of API development • Familiarity with Airflow, DBT, Databricks • Experience working with Enterprise Resource Planning (e.g. Oracle, SAP) and CRM systems. If this role sounds of interest, please apply using the more »
well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding of their ETL frameworks. Desired: experience with data pipeline and workflow management (Airflow, Composer). Desired: familiarity with machine learning algorithms and data science principles is a plus. Qualifications: Degree in Computer Science, Engineering, or related field. more »
as thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial more »
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions. Key technologies: AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes, Apache Airflow. AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways. AWS CloudWatch or equivalent. Kafka or similar data streaming more »
trends and best practices. Qualifications: Expertise in Java and Python development (Essential). Experience with Spark or Hadoop (Essential). Knowledge of Trino or Airflow (Desirable). Proven ability to design and implement scalable and secure solutions. Excellent communication and collaboration skills. more »
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Set2Recruit
proficiency. Git proficiency. Linux use and admin. Experience deploying cloud services (AWS is a bonus). Experience with Docker and Kubernetes. Using frameworks such as Airflow. ML background: PyTorch for computer vision. This is a fully remote role which comes with: Budget for WFH set-up. Stock options. 25 days more »
to-end from scoping, designing, coding, release and continuous monitoring in a production environment. • ELT pipeline: Experience with ELT pipelines and orchestration systems such as Airflow. • Database systems: Experience working with one or more non-SQL databases such as Druid, Elasticsearch and Neo4j. • AWS: Experience deploying and managing applications more »
within the EU Fusion programme and connections to international HPC communities, showcasing contributions made to the field. - Experience in workflow management systems such as Apache Airflow. - Familiarity with Research Data Management methodologies, modern database technologies including SQL, NoSQL and Graph Databases, and parallel file access technologies such as MPI more »
Greater London, England, United Kingdom Hybrid / WFH Options
Cera
an autonomous environment. 5+ years' SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch). 3+ years' experience delivering data products in a modern BI technology (ideally Looker) or open-source data frameworks. Experience managing end-to more »
working on very complex systems. Strong experience with computer vision. Longevity in their previous roles. Experience with remote sensing highly desirable. Stack: Python, PyTorch, Airflow, PySpark (equivalent tools are fine more »
effective data visualisation solutions. Key skills: Strong Python and SQL coding experience essential. Experience with scripting languages. Proficiency in data orchestration tools (Dagster/Airflow). Proficiency in visualisation tools such as Tableau and Power BI. Proficiency with Git and CI/CD (ideally using Azure DevOps). Salary more »
engineers to assist with technical work in agreed territories. Key skills/knowledge: Expertise in GCP. Experience in Python, SQL. Knowledge and expertise in Airflow, Dataform, Git. Previous experience working in a team in one of the following functions: data analytics, software or application development, database development. Previous experience more »
City Of London, England, United Kingdom Hybrid / WFH Options
Cititec Talent
a leading commodities trading firm. Outside IR35. Hybrid working: 2/3 days in London office. Experience required: commodities industry experience; weather forecasting; Airflow (or equivalent data orchestration platforms); data modelling (star schema); data warehousing; data pipeline orchestration (Kafka); on-prem SQL more »
Experienced creating data pipelines in a cloud (preferably AWS) environment. CI/CD experience. Containerization experience (Docker, Kubernetes, etc.). Experience with SQS/SNS, Apache Kafka, RabbitMQ. Other interesting/bonus skills: Airflow, Trino, Apache Iceberg, Postgres, MongoDB. You *must* be eligible to work in your chosen more »