Role details

This position is now filled

IT Solution Developer - Big Data

Toronto • Permanent

  • A Leading Canadian Financial Institution
  • Big Data Solutions

About Our Client

Leading Canadian Financial Institution

Job Description

  • Work as part of a team that develops and supports the overall movement of data from source systems through the IBM toolset/solution (ETL architecture).
  • Develop data services and interfaces including API calls to core TD systems.
  • Act as a technical coordinator in preparing the development environment and ensure its integrity throughout the project.
  • Perform merging and release management activities, such as the packaging of releases for a deployment, while ensuring the integrity of commonly used code/scripts in multiple releases.
  • Design, develop, and manage ETL jobs/services using IIS Data Stage.
  • Ensure defect-free programming by testing and debugging with appropriate tools, and participate in peer code reviews.
  • Develop and support reporting in Cognos and other reporting systems.
  • Perform analysis, design, and programming following the system development life cycle (SDLC) methodology while adhering to bank technology standards.
  • Develop the systems/processes that support the movement of data, including data masking and encryption.

The Successful Applicant

  • Post-secondary degree in Computer Science, Engineering, or a similar field preferred
  • M.S. or Ph.D. in a relevant technical field such as Statistics, Mathematics, Computer Science, or Machine Learning
  • 3-5 years of experience in statistical analysis, machine learning, and analytics-related roles
  • Significant experience with Hadoop projects, distributed computing, data and systems management, and Hadoop components such as Hive
  • Exposure to Apache Spark using Scala/Python and machine learning with MLlib is an asset
  • Experience with one or more of the following is a definite plus:
    • Graph Processing
    • Text Processing (or NLP)
    • Time series analysis (or stream analysis)
  • Knowledge of Unix/Linux is essential, including Hadoop utilities, Java, and virtual environments

What's on Offer

A Competitive Package