help them further grow their already exciting business. Within this role, you will be responsible for maintaining, supporting and expanding existing data pipelines using DBT, Snowflake and S3. You will also be tasked with implementing standardised data ingress/egress pipelines, coupled with onboarding new, disparate data sets, sourced from …
for the entire organisation and proper data governance. Utilise and improve our current AWS-based data platform. Work with our tech stack, which includes dbt/DuckDB for transformation, Kafka/RabbitMQ as a streaming platform, Delta Lake as a data format, Dagster for managing data assets, and Terraform, Kubernetes, and …
good grasp of frameworks like Dropwizard. Lakehouse Architectures: Familiarity with modern data technologies such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt, and Airflow/Dagster. AWS Services: Hands-on experience with AWS, especially S3, ECS, and EC2/Fargate. Collaborative Approach: Proven ability to work effectively …
AWS infrastructure, including S3, Redshift, Lambda, Step Functions, DynamoDB, AWS Glue, RDS, Athena, Kinesis, and QuickSight. We also widely use other tech such as Snowflake, DBT, Databricks, Informatica, Matillion, Airflow, Tableau, Power BI, etc. The Lead Data Architect will liaise with clients to define requirements, refine solutions and ultimately hand over …
s largest clients. Develop solutions to parse and process tabular data from PDF and HTML documents. Maintain, support and expand existing data pipelines using DBT, Snowflake and S3. Implement standardised data ingress/egress pipelines. Onboard new, disparate data sets, sourced from many and varied data vendors, covering all asset …
and tracking the technology innovations applicable to the solutions. Proven experience of working effectively with senior business stakeholders. Experience of using tools including Snowflake, DBT, ADF and Azure Synapse. Ability to lead teams and projects towards a common architecture approach and language. Strong communication and collaboration skills. Additional Information: Legal …
Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack: Python, AWS, Airflow and DBT. Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the CTO. 2+ …
Airflow, Snowflake, etc. Expertise designing and developing with distributed data processing platforms like Databricks/Spark. Experience using ELT/ETL tools such as DBT, Fivetran, etc. Understanding of Agile Delivery best practice. Good knowledge of the relevant technologies, e.g. SQL, Oracle, PostgreSQL, Python, ETL pipelines, Airflow, Hadoop, Parquet. Strong …
Python/JavaScript/C#. Familiarity with statistical/machine learning/AI concepts and techniques. Understanding of data pipeline/orchestration tools, e.g. dbt, Dataform. Appreciation of GCP’s serverless technologies, e.g. Cloud Run/Workflows. Understanding of Google’s marketing stack: Google Analytics, Google Tag Manager, Google Ads …
tools such as Looker is highly advantageous. Experience working with cloud data warehouses, ideally with AWS/Redshift, Azure, GCP, or Snowflake. Experience with dbt is highly advantageous. Responsibilities: Analyze, organize, and prepare raw data for modeling and data analytics. Architect and assist in building data systems and pipelines. Evaluate …
Google Cloud Platform (GCP) - Background using Airflow/Cloud Composer with Python - Cloud-based data platforms, Snowflake or BigQuery - Advanced SQL - Data transformation tools, DBT - CI/CD - TDD. If you're open to exploring this opportunity and believe your skills align with what we're looking for, I'd …
MySQL). Experience with RESTful API frameworks. Development Opportunities: Successful candidates will have the opportunity to pursue advanced certifications in key technologies such as DBT, Snowflake, and Azure SQL. There is potential for career advancement to Data Architect or product engineering roles, focusing on solving complex client challenges. Desired Values …
Required Skills and Experience: Extensive experience in: Data Warehousing, Data Engineering, overall Data Analytics, Data Visualisation. Proficiency in: Google Cloud (GCP), GCP BigQuery, Python, DBT or similar, FastAPI or similar, Airflow or similar. Desirable: Google Apigee (as an application developer), exposure to Machine Learning projects, exposure to DataOps, exposure to …
thinking strategically and proactively identifying opportunities. You have experience collaborating with senior business stakeholders and finance teams. You have working knowledge of Python, Airflow, dbt, BigQuery, and Looker. Additional desirables: Experience in one or more finance domains, such as Financial Reporting, Treasury, Regulatory Reporting, Financial Planning & Analysis, Financial Risk, and …
with AWS, especially AWS services focused on data flow, pipelines, data transformation, storage and streaming. Excellent data engineering skills, for example with SQL, Python, DBT and Airflow. Good understanding of service-oriented architecture; experience of exposing and consuming data via APIs, streams and webhooks. Confidence in coding, scripting, configuring, versioning …
production via machine learning engineering. Hands-on knowledge of NoSQL and relational databases, alongside relevant data modelling techniques. ETL/ELT tooling: Knowledge of DBT, Luigi or similar orchestration tooling. Experience of managing and developing a team in an agile environment. Preferred: Software development of a product/service provided as …
modelling. Bonus: Experience with Data Vault 2.0. Analytical mindset with attention to detail. Enthusiastic about learning and thriving in a dynamic setting. Familiarity with dbt, Snowflake, Airflow, Fivetran, and Terraform. Join and be part of a dynamic team driving innovation and excellence in data engineering …