forecasting, propensity modelling, predictive/prescriptive campaign performance analysis, and optimising marketing spend. Programming proficiency in Python, SQL, Bash, and Excel, including experience with Jupyter notebooks, type-checking, functional programming, pytest, pandas, scikit-learn, PyTorch, CI/CD, and Git. Experience with Docker, Kubernetes, and cloud platforms such as AWS and Databricks …
working in a tech team using a diverse tech stack including: Backend: Python, FastAPI, PostgreSQL, Vespa, SQLAlchemy, Flask. Frontend: React, Next.js. Data Science: Python, Jupyter, PyTorch, pandas, spaCy, Hugging Face, NumPy, Streamlit, Weights & Biases. Infra: Pulumi, Docker, AWS App Runner, Step Functions, Grafana Cloud monitoring, Prefect. Who you are: Must-haves …
skills with SQL, working with large and complex data sets to extract insights and identify trends. Advanced programming skills with Python, including experience with Jupyter notebooks, pytest, pandas, scikit-learn, PyTorch, type-checking, functional programming, CI/CD, and Git. A broad background in machine learning for customer and marketing purposes …
in order to fully understand the data; programming in Python or Ruby, utilizing AWS S3, MongoDB, PostgreSQL, AWS Redshift or similar database technologies; using Jupyter notebooks and one or more statistical visualization or graphing toolkits such as Excel, Qlik Sense or Tableau. …
Manchester, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). • Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. • Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). • Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. • Python libraries for data management, statistical analysis, machine learning, and visualisation. • Machine learning frameworks such as TensorFlow …
leverage the data in detecting and mitigating abusive HTTP L7 traffic. Tech stack includes Go, Python, JavaScript, Rust, Lua, Kafka, Kubernetes, ClickHouse, PostgreSQL, Jupyter Notebook. Examples of desirable skills, knowledge and experience: lead and grow engineering teams committed to building innovative security products; be a strategic thinker, able to influence …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
development, including research and developing new propositions. Proficiency in Data Science and Machine Learning services from cloud providers (AWS, Azure, GCP). Python and the Jupyter ecosystem (Lab, Notebook) and related libraries and tooling. Python libraries for data management, statistical analysis, machine learning, and visualisation. Machine learning frameworks such as TensorFlow …
broader Python ecosystem (Django, NumPy, pandas, scikit-learn, TensorFlow, PyTorch, etc.). Experience in machine learning, data science, and data visualization, using tools such as Jupyter, matplotlib, seaborn, plotly, etc. Knowledge of web development, RESTful APIs, cloud computing, and database technologies. A strong interest and curiosity in biotechnology and its applications. If …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Primis
of Data Warehousing. Technical expertise with data models, data mining, and segmentation techniques. Technical Lead Data Engineer desirables: knowledge of Databricks and Jupyter notebooks; proven experience of Azure services; experience in data science and machine learning models; knowledge of TFS/Azure DevOps. AWS | GCP | Azure | TSQL | SQL …
Bloomberg, ISS and MSCI. Significant experience with building robust data-oriented solutions in Python, including web apps. Experience with modern data tools, ideally Tableau, Jupyter Notebooks, pandas and Python visualisation libraries. Exposure to machine learning approaches such as neural networks, NLP and generative AI desirable. Experience working at an investment management …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Primis
and segmentation techniques. Knowledge of programming languages (e.g. Python, R). Data Engineer desirables: proven experience of Azure services; knowledge of Databricks and Jupyter notebooks; experience in data science and machine learning models. Azure | SQL | ETL | Data Models | Python | Design | R | Data Mining | Hybrid | Edinburgh | Please apply if you …
as NLI and Q&A using LLMs. Working knowledge of deep learning libraries such as PyTorch or TensorFlow. Demonstrable Python experience, including coding in Jupyter Notebooks as well as the ability to write productionised code, for example a Flask API to serve a model. Ability to demonstrate the regular use of …
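The "Flask API to serve a model" deliverable mentioned above can be sketched minimally as follows. This is an illustrative sketch, not any particular company's service: the endpoint name, payload shape, and the hard-coded linear scorer standing in for a trained PyTorch or scikit-learn model are all assumptions.

```python
# Minimal sketch of serving a model behind a Flask API.
# The "model" here is a stand-in hard-coded linear scorer; in a real
# service you would load a trained artifact once at startup instead.
from flask import Flask, jsonify, request

app = Flask(__name__)


def predict(features):
    """Stand-in for a real model's inference call (hypothetical weights)."""
    weights = [0.4, 0.6]
    return sum(w * x for w, x in zip(weights, features))


@app.route("/predict", methods=["POST"])
def predict_endpoint():
    # Parse the JSON body and validate the expected feature vector length.
    payload = request.get_json(force=True)
    features = payload.get("features", [])
    if len(features) != 2:
        return jsonify(error="expected 2 features"), 400
    return jsonify(score=predict(features))


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```

A client would then POST `{"features": [1.0, 1.0]}` to `/predict` and receive `{"score": 1.0}` back; the point of the pattern is that the notebook-developed `predict` function is wrapped in input validation and an HTTP contract before deployment.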
Redshift, AWS S3. Environments/Infra: AWS (required), [AWS Lambda, Terraform] (nice to have). Platforms: creating data pipelines within Databricks or equivalent such as Jupyter Notebook; Power BI (nice to have). Knowledge of enterprise models for engineering: microservices, APIs, AWS services to support models (SQS, SNS, etc.). Please note, the role is …
Python programming, but also operate with modern tech such as Snowflake, Airflow, dbt, Kubernetes/Docker and Cube.js, whilst also using Tableau CRM and Jupyter notebooks for gaining data insights. If you think there’s a better or newer tool, you’re free to use that too. The key thing …
targets for experimental testing. Be familiar with NGS and associated pipelines. Collate and annotate reference sequences across multiple microorganisms. Be confident using Python and JupyterLab notebooks as a working and application development environment, along with Git as a version control system. Requirements include: MSc degree or equivalent in a …
preferably within the MLOps context. - Understanding of or familiarity with technologies such as: scikit-learn, TensorFlow, Torch, ChatGPT, Llama, LangChain (or equivalent), RAG, Model Security, Jupyter Notebook/JupyterLab, Unit Testing, Integration Testing, E2E Testing, ETL/ELT, Python Virtual Environments, Containerisation with Docker/Podman, GitLab, Azure DevOps, Kubernetes, Azure …
platform. You will be part of a small team, working in a relaxed environment and using a range of modern open-source tools like Jupyter, PyTorch and Spark. You will work with terabytes of data as part of a team that can quickly run experiments at scale …
This is a new position for a Data Scientist at a global, data-driven company with cutting-edge technology that leverages data as a true market differentiator. The focus of this role is to deliver data science solutions …