with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments
engineering principles, processes, and technologies. Strong SQL skills (ideally with Azure SQL), experience working with relational databases, and programming experience in Python or Scala. Experience with Snowflake and its tooling (Snowpark, Snowpipe, etc.). Familiarity with Fivetran, DBT, TensorFlow, PyTorch, and other modern data stack components. Knowledge of data integration and ETL frameworks and
GCP DataProc or GCP Cloud Data Fusion. * NoSQL Databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore. * BigQuery and Data Studio/Looker. * Snowflake Data Warehouse/Platform. * Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. * Experience of working with CI/CD technologies, Git
varying data proficiency. Thorough understanding of data lake and data warehousing principles and full project involvement in one or more major technology platforms, e.g. Snowflake, Databricks. Proven experience with one or more Cloud Services providers, e.g. AWS, Azure or Google Cloud Platform. Good understanding of role-based access control, its
London (city), London, England Hybrid / WFH Options
T Rowe Price
Python, Java 11+ or similar, with 6+ years of professional experience. A good understanding of modern lakehouse architectures and corresponding technologies, such as Dremio, Snowflake, Iceberg, (Py)Spark/Glue/EMR, dbt and Airflow/Dagster. Experience with Cloud providers. Familiarity with AWS S3, ECS and EC2/Fargate
Salford, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Aj Bell Limited
available for the D2C data and insights team to develop solutions from. Be subject matter expert in the D2C team for cloud-based data solutions (including Snowflake and AWS Aurora). Work with Data Ingestion Specialist and Enterprise Architecture teams to develop and implement robust data pipelines and ETL processes to streamline data
South East London, England, United Kingdom Hybrid / WFH Options
Durlston Partners
team, focused on building a massively distributed cloud-hosted data platform that would be used by the entire firm. Tech Stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt and Airflow/Dagster. Cloud: AWS, Lambdas, ECS services. This role would focus on various areas of Data Engineering including: End to
solutions including the choice of data sources and ETL approach. Familiar with engineering processes for developing APIs. Understanding the principles of building solutions using Snowflake, open-source frameworks, multi-cloud infrastructure. In Return: A bonus scheme that pays up to 20%, and a benefits package that is one of the
South East London, England, United Kingdom Hybrid / WFH Options
Orbis Group
CI/CD. Professional experience with SQL and data transformation, ideally with dbt or similar. Experience with at least one of these Cloud technologies: AWS, Microsoft Azure, Snowflake, GCP. Apply to the Role: Roles like these are snapped up very quickly, so act now if you do not want to miss out! Reply to this
for achieving project success. Key Responsibilities: 1. Software Development: • Write high-quality, maintainable code using languages such as Python and SQL. • Establish data tools like Snowflake and Azure Data Lake Services (ADLS) Gen 2. • Utilize PowerBI, Tableau, or similar tools to design and create interactive and visually appealing dashboards and reports. • Work
language such as Python or Java. Experience with big data technologies such as Hadoop and Spark. Experience with data warehousing technologies such as Redshift, Snowflake, or BigQuery. Experience with data pipeline and ETL tools such as Apache NiFi, Airflow, or Glue. Knowledge of data governance and security best practices. Strong
automation, data visualization tools, DevOps practices, machine learning frameworks, performance tuning, and data governance tools. Technical proficiency in Microsoft Azure SQL (PaaS & IaaS), CosmosDB, Snowflake Data Warehouse, Power Apps, Reporting Services, Tableau, T-SQL, Python Programming, and Azure Purview. If you're ready to join a dynamic team and drive
Coventry, England, United Kingdom Hybrid / WFH Options
WEG Tech
or equivalent experience. Strong experience working as a Data Engineer, preferably in a cloud-based environment. Experience with cloud-based data storage platforms, preferably Snowflake, Azure SQL Data Warehouse or AWS Redshift. Solid practical experience and understanding of Data Vault and Kimball-style data warehousing methodologies. Proficient in SQL and data
to complex business requirements spanning a number of systems. At least 10 years of relevant experience. Hands-on in-depth experience in the following: Snowflake/DBT/Airflow. Background/working experience in the following: Azure, Power BI/DAX, traditional SQL (SQL Server, MySQL, Postgres), JIRA, Confluence, (Github/
experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Additional Information Location: This role can be delivered in a hybrid nature from one of these offices