Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Develop
Mainframe Scheduling Engineer £Negotiable 1 Year Contract Fully Remote Primary role will be to provide advanced operational support and scheduling expertise for Mainframe Batch processing, including but not limited to Batch Job Failure Management, Batch Job Processing Management, Queue Management, Ad-hoc processing requests … Virtual Resource Management, Batch Job Schedule Management (creating new schedules, deleting redundant schedules), Release Code Management Support, JCL Management Support, Critical Path Monitoring. Represent the team at technical/business meetings/events in a professional and assertive manner. Communicate and collaborate with Business & Operational Teams. Mainframe Batch … Manage Mainframe Scheduling tools in line with Experian security, compliance, and policies, using standard tools, e.g. CyberArk, ServiceNow. Potential on-call support (24x7x365) for Batch Workloads. Skills: Required Skills: Knowledge and experience with Mainframe systems, CA7, CA11, Mainframe Operations in general (JCL, ISPF, JES2, z/OS). Desirable skills …
basis. Some paid overtime may be required from time to time. Primary role will be to:
• Provide advanced operational support and scheduling expertise for Mainframe Batch processing, including but not limited to:
o Batch Job Failure Management
o Batch Job Processing Management / Queue Management
o Ad-hoc processing requests / Virtual Resource Management
o Batch Job Schedule Management (creating new schedules, deleting redundant schedules)
o Release Code Management Support
o JCL Management Support
o Critical Path Monitoring
o Represent the team at technical/business meetings/events in a professional and assertive manner
o Communicate and collaborate with Business & Operational Teams
o Mainframe Batch Workload Problem escalations (internal and vendor support)
o Follow ITIL Business processes: Change, Incident, Problem and Root Cause Analysis
o Manage Mainframe Scheduling tools in line with security, compliance, and policies, using standard tools, e.g. CyberArk, ServiceNow
o Potential …
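The "Critical Path Monitoring" duty above can be illustrated with a small sketch: given job durations and a dependency graph, the critical path is the longest-duration chain through the batch. The job names, durations, and dependencies below are hypothetical, not from any real CA7 schedule.

```python
# Sketch: find the critical path through a batch job dependency graph.
# Job names and durations are illustrative, not from a real schedule.

def critical_path(durations, deps):
    """durations: {job: minutes}; deps: {job: [prerequisite jobs]}.
    Returns (total_minutes, [jobs along the critical path])."""
    memo = {}

    def longest(job):
        # Longest elapsed time through `job`, plus the chain achieving it.
        if job in memo:
            return memo[job]
        best_time, best_chain = 0, []
        for pre in deps.get(job, []):
            t, chain = longest(pre)
            if t > best_time:
                best_time, best_chain = t, chain
        memo[job] = (best_time + durations[job], best_chain + [job])
        return memo[job]

    return max((longest(j) for j in durations), key=lambda r: r[0])

if __name__ == "__main__":
    durations = {"EXTRACT": 30, "TRANSFORM": 45, "LOAD": 20, "REPORT": 15}
    deps = {"TRANSFORM": ["EXTRACT"], "LOAD": ["TRANSFORM"], "REPORT": ["LOAD"]}
    total, path = critical_path(durations, deps)
    print(total, path)  # 110 minutes along EXTRACT -> TRANSFORM -> LOAD -> REPORT
```

In practice a scheduler such as CA7 tracks this internally; the sketch only shows why a delay on a critical-path job pushes the whole batch's end time, while slack jobs do not.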
expand the integration of our custom compiler technology with the internals of leading open-source software for Big Data analytics, to drive faster data processing and broader operation coverage. Your contributions to our core product will directly impact data infrastructure processing tens of petabytes every day where Xonai is being deployed. What you will do: Implement extensible optimizations for various query processing components such as columnar batch processing, data reading, shuffling and data partitioning algorithms. Implement code generation targeting the Xonai custom DSL to accelerate the SQL operations to be supported. Run profiling and benchmark tools to … diligently write, test and deploy production code driven by modern software engineering practices. Collaborate with team members to drive new innovations in big data processing, lying at the intersection of Big Data and compilers. What you will bring: 3+ years of experience working with performance optimization and complex codebases. …
automated systems development, e.g. in a hedge fund or investment bank. Expertise in building distributed systems with large data warehouses and both on-line and batch processing. Experience of web-based development and visualisation technology for portraying large and complex data sets and relationships. Substantial quant development engineering experience with relevant mathematical …
growing Software Development team. You will engage in all phases of the software development lifecycle, focusing on creating a low & no-code solution for batch data operations in reporting and visualization. Their solution leverages Azure services, including Azure Web App, Storage Accounts, Azure Data Factory, and Databricks, with Python and C# as the primary languages. Responsibilities: Collaborate on data harmonization and transformation requirements. Develop and maintain data processing libraries and pipelines. Optimize data processes for performance and scalability. Ensure data quality through validation and testing. Requirements: 5+ years of software development experience. Expertise in Databricks, Python, and cloud deployment (preferably Azure). Experience with batch processing data. Nice to Have: Market research or retail data management experience. Familiarity with Agile, TDD, and Lean methodologies. Experience with Large Language Models and React. This role is fully remote and can be based anywhere across Spain, France and the United Kingdom. …
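The batch data operations described (harmonization, transformation, validation) can be sketched in plain Python without the Azure services involved. Everything here is illustrative: the field names, the `harmonise` rule, and the batch size are assumptions, and a real pipeline would run inside Data Factory or Databricks rather than a local loop.

```python
# Sketch: validate and transform records in fixed-size batches.
# Field names, the harmonisation rule, and the batch size are illustrative.
from itertools import islice

def batches(iterable, size):
    """Yield successive lists of up to `size` items."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def harmonise(record):
    # Example harmonisation: normalise casing/whitespace and coerce types.
    return {"sku": record["sku"].strip().upper(), "qty": int(record["qty"])}

def run_pipeline(records, batch_size=500):
    good, bad = [], []
    for batch in batches(records, batch_size):
        for rec in batch:
            try:
                good.append(harmonise(rec))
            except (KeyError, ValueError):
                bad.append(rec)  # quarantine invalid rows for review
    return good, bad
```

Quarantining bad rows rather than failing the whole batch is one common way to satisfy the "ensure data quality through validation" responsibility while keeping throughput predictable.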
cost-effective manner. Your expertise in cloud infrastructure will be crucial in ensuring the reliability and availability of the company's software products. Data Processing Pipelines: You'll design and implement data processing pipelines using technologies like Kafka, Hadoop, Hive, Storm, or Zookeeper, enabling real-time and batch processing of data from the blockchain. Hands-on Team Leadership: As a hands-on leader, you'll lead by example, actively contributing to the development efforts while also mentoring and coaching team members. Your leadership will help foster a culture of excellence and continuous improvement within the …
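The "real-time and batch" pairing above is often bridged by micro-batching: grouping a continuous stream of events into small batches that flush on either a size or a time threshold. A minimal sketch, with a plain iterator standing in for a Kafka consumer and thresholds that are purely illustrative:

```python
# Sketch: group a stream of events into micro-batches, flushing when either
# a size limit or a time window is reached. A plain iterable stands in for
# a Kafka consumer; the thresholds are illustrative.
import time

def micro_batches(events, max_size=100, max_wait_s=1.0, clock=time.monotonic):
    batch, started = [], clock()
    for event in events:
        batch.append(event)
        if len(batch) >= max_size or clock() - started >= max_wait_s:
            yield batch
            batch, started = [], clock()
    if batch:  # flush the final partial batch
        yield batch
```

Injecting `clock` keeps the time-based flush testable; production systems (e.g. Spark Structured Streaming) implement the same trade-off between latency and per-batch overhead.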
Experience with Apache Spark and/or Databricks. Familiarity with BI visualization tools like Power BI. Experience in managing end-to-end analytics pipelines (batch and streaming). Proficiency in root cause analysis and post-incident reporting in a production environment. Excellent interpersonal skills. Strong report writing, presentation, and record … Computer Science, Engineering, or related field. Certifications such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme – up to 10%. Excellent pension …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
services, such as Data Factory, Databricks, Azure SQL and Synapse Analytics, among others, to build scalable and secure data solutions that support complex onward processing by both operational and analytical teams. What will this role look like? - Develop and maintain scalable, efficient, and robust data architectures. This involves creating 3NF and dimensional data models, designing and developing robust batch processing pipelines, and working with the operational and analytical teams to solve complex business problems. - Monitor system performance, identify bottlenecks, and implement changes to improve data processing and pipeline/system performance. - Work closely with Data Scientists …
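The "monitor system performance, identify bottlenecks" duty can be sketched simply: time each pipeline stage and compare. The stage names and functions below are illustrative; a real setup would emit these timings to a metrics system (e.g. Azure Monitor) rather than return them.

```python
# Sketch: time each pipeline stage to spot bottlenecks. Stage names and
# functions are illustrative; real monitoring would feed a metrics system.
import time

def run_stages(stages, data):
    """stages: list of (name, fn) applied in order.
    Returns (final result, {stage name: elapsed seconds})."""
    timings = {}
    for name, fn in stages:
        start = time.perf_counter()
        data = fn(data)
        timings[name] = time.perf_counter() - start
    return data, timings
```

Sorting the returned timings immediately shows which stage dominates the batch window, which is usually the first question when a pipeline starts missing its schedule.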
Edinburgh, City of Edinburgh, United Kingdom Hybrid / WFH Options
Cathcart Technology
Lead Data Platform Engineer, you'll be helping the team handle vast amounts of data that constantly pour in. You'll be joining the batch processing team, where you'll play a pivotal role in developing products used in-house by both Data Engineers and Software Engineers. For …
Azure Stream Analytics, and Azure Synapse Analytics. A solid foundation in data engineering, data science, and product development, encompassing experience in both stream and batch processing. Designing and deploying production data pipelines, utilizing languages such as Java, Python, Scala, Spark, and SQL. Handling substantial volumes of structured and unstructured …
/digital output, email, and SMS) and enterprise database technologies. Output stream awareness around PostScript, AFP, PDF, and HTML channels. Template upgrade & migration, normalization, batch processing, and data processes such as formatting and mapping (XML). Solid knowledge and hands-on working experience with scripting is mandatory for the following skills …
Familiarity with streaming/event-driven architecture, including tools like Kafka and Kinesis/Firehose. Expertise in data ingestion techniques, encompassing both stream and batch processing. Proficiency using tools like Terraform for Infrastructure-as-Code and AWS infrastructure management. What's in it for you? Free Membership and …
to some essential tools and a high level of financial knowledge. Good exposure to AWS technologies is essential; Airflow would be desirable. Experience with batch processing is required. Experience working in a Linux environment. Scripting exposure in Python, Bash or Shell. Experience with MSSQL, Oracle or …
interacting with & supporting 3rd party vendors. Responsibilities: Responsible for the support & integration of the Guidewire application to business estates. Identify opportunities and propose integration solutions, including batch processing and message queues. Manage and provide technical support for Guidewire applications in production. Liaise & interact with 3rd party vendors. Engage with technical teams …
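The message-queue option mentioned above usually pairs with a dead-letter pattern: retry each message a bounded number of times, then park persistent failures for investigation instead of blocking the queue. The sketch below is generic, not Guidewire-specific; the handler and retry limit are assumptions.

```python
# Sketch: process integration messages with bounded retries and a
# dead-letter list for messages that keep failing. The handler and the
# retry limit are illustrative, not Guidewire-specific.

def consume(messages, handler, max_attempts=3):
    processed, dead_letter = [], []
    for msg in messages:
        for attempt in range(1, max_attempts + 1):
            try:
                processed.append(handler(msg))
                break
            except Exception:
                if attempt == max_attempts:
                    dead_letter.append(msg)  # park for manual investigation
    return processed, dead_letter
```

Keeping poison messages out of the main flow is what makes queue-based integration more forgiving than a monolithic batch job, where one bad record can abend the entire run.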
Health and Safety and Company Policies. What skills will you need? Essential qualifications/requirements: High-volume manufacturing experience, including both continuous and batch processes. Significant experience working in a maintenance role. Experience in electrical and mechanical fault finding. Experience developing planned maintenance systems and procedures. Proven …
CHO/HEK cell lines. Develop stable CHO cell lines to scale up antibody production to multi-gram quantities. Optimise antibody production using fed-batch processes in bioreactors, ensuring high titer and quality. Innovate within the protein expression domain, staying abreast of and implementing cutting-edge technologies. Candidate Expectations …
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
Glue, Redshift, Kinesis, Lambda, Data Factory, and Databricks. Strong experience in data engineering, data science, and product development, encompassing experience in both stream and batch processing. Proficiency in the design and deployment of production data pipelines, involving languages like Java, Python, Scala, Spark, and SQL. You should also have … including the integration of data through AWS and Azure cloud-based ingestion and curation processes. Knowledge of data mining, machine learning, and natural language processing is considered an advantage. Familiarity with Agile methodologies and participation in Scrum ceremonies. Competence in the design and construction of solutions tailored for the …
technologies for ETL, data warehouse, and data lake design.
• Hands-on experience with AWS services like EMR, Glue, Redshift, Kinesis, Lambda, DynamoDB.
• Capable of processing large volumes of structured and unstructured data on AWS.
• Familiarity with AWS best practices in data engineering, data science, and product development.
• Knowledgeable in both stream and batch processing.
• Comfortable designing and building for AWS cloud, including Platform-as-a-Service, serverless, and container technologies.
• Diverse skills in ETL, data warehouse, and Data Lake design.
• Familiarity with various tools, cloud technologies, and approaches.
• Expertise in AWS or equivalent open-source tools.
• Proficient …
Knowledge of developing real-time data stream systems (ideally Kafka). Proven track record in developing data systems using PySpark and Apache Spark for batch processing. Capable of managing data intake from various sources, including data streams, unstructured data, relational databases, and NoSQL databases. Extensive knowledge in distributed systems … visibility. Play a major role in the development and optimisation of data systems to support our Data Lake/Data Mesh architecture and data processing needs, ensuring the provision of scalable, high-performance data products for the entire organisation and proper data governance. Utilise and improve our current AWS … possible solutions. This position offers the opportunity to work with a bright and varied team to develop and improve our data platform for data processing, access, gathering, and monitoring, building the future of our data infrastructure. …
be able to provide an accurate assessment of the current capabilities and limitations of the systems. They will understand the critical path of their batch and the associated downstream, upstream and business impacts. They will work closely with the Level 3 team and other cross-functional teams and check the … expectation setting. * You will actively work on and resolve any IT risks and vulnerabilities across the application stack. * You will contribute towards ensuring all batch SLAs/OLAs are appropriate, understood and agreed for the critical path of your batch processes, ensuring that you understand the …