Data Scientist - Auto Scoring

Job purpose and background

Are you passionate about using data to drive change? Do you have excellent Data Science skills alongside the ability and desire to work with users of data and to communicate well?

We are looking for a Data Scientist to be responsible for leading and delivering the goals of the new data platform. The successful candidate will work on designing and delivering script migration, helping to build up the Azure data platform capabilities, and will work closely with the Data & Insights Department to ensure cross-organization consistency.

This role requires experience in building data pipelines, deploying models using the Azure stack, automating pipelines, and sourcing and preparing data in collaboration with the data engineering team. The successful candidate will need to demonstrate the capability to work and communicate effectively with others, including stakeholders and thematic teams, to ensure processes are followed, deliverables are aligned to milestones, and outputs are built to agreed quality standards.

Key responsibilities include:

  • Delivery of Python scripts automating repetitive tasks, closely following the environmental methodology provided by experts
  • Re-development and enhancement of scripts to take advantage of the new data platform's Databricks capabilities and script orchestration with Azure Data Factory (ADF)
  • Delivery of automated outlier detection-based data cleaning algorithms
  • Assistance with the provisioning and preparation of third-party data
  • Collaboration with the cloud team on configuration and best practices for the cloud data platform
  • Model deployment
  • Scripting and dashboarding for performance tracking
  • Productionising analysis pipelines through a cloud toolset and hosting static and dynamic presentations of the generated insight.

Required skills and experience:

  • MSc/PhD in Computer Science, Statistics, Mathematics, or a similar field.
  • At least 2 years of experience using open-source programming languages for large-scale analysis (Python, R, SparkR, PySpark) and working with data stores such as MongoDB, Parquet, and Hive, including querying databases with SQL
  • A strong mathematical and statistical background with a deep understanding of statistical inference, experimental design, sampling, and simulation
  • Strong experience in the training and production of machine learning models using both structured and unstructured data in big data pipelines, in Azure
  • Experience with well-known code libraries for data preprocessing (pandas, dplyr, tidyr, scipy, feature-engine, Beautiful Soup, scrapy, spaCy, NLTK, TextBlob, fastText, polyglot, requests, json, functools).
  • Good technical communication & presentation skills in English.
  • Be able to work in a matrix environment within a virtual team.
  • Excellent problem-solving skills.
  • Strong project experience with NLP, text analytics, and other relevant areas (e.g. text classification, topic detection, information extraction, named entity recognition, entity resolution, question answering, sentiment analysis, event detection, language modelling).
  • Experience with managing and deploying models using Azure Databricks, Azure Data Lake, and Azure Data Factory.
  • Experience with version control and shell scripting

Desired skills and experience:

  • Experience working as part of a scrum team.
  • Experience with automated test scripts and environment promotion pipelines (e.g. Azure DevOps pipelines).
  • A good understanding of greenhouse gas (GHG) and sustainability data.
  • Knowledge of the financial system and capital markets.
  • An awareness of environmental issues, particularly as they relate to our core themes of Climate Change, Deforestation and Water Security.
  • Ambition to begin enabling and coaching colleagues as part of an expanding organization with a growing data science function.
  • Excellent data visualization skills using Power BI or similar tools.


Further details

This is a full-time, permanent role based at CDP’s Central London office with options for remote and flexible working.

Salary and benefits: £35,000 - £40,000 per annum depending on experience, 30 days' annual leave plus bank holidays, a non-contributory pension scheme of 8% of salary within 3 months of joining, an Employee Assistance Program, a life assurance policy, training and development, and a discretionary annual bonus scheme depending on CDP’s overall performance.

Interested applicants must be eligible to work legally in the UK.

How to apply

To apply, please upload your CV and a covering letter of no more than two pages as an additional document in the application form, setting out how you meet the required skills and experience or key responsibilities. The deadline is 13th December.

Additional information

  • Remote status: Flexible remote

We usually respond within two weeks


CDP Worldwide - London

EC3R 5AZ, London
hr.admin@cdp.net
020 3818 3925
