Teradata Analyst

  • Job Reference: HQ00009415
  • Date Posted: 29 November 2017
  • Recruiter: E-Resourcing
  • Location: Brussels
  • Salary: £335.76 to £419.70
  • Sector: IT (general)
  • Job Type: Contract
  • Duration: 6 months
  • Work Hours: Full Time

Job Description

Context of the mission

Our client is looking for an Analyst/Developer experienced in Teradata.
You will be part of a squad that works in an agile framework (Scrum). All squad members sit together physically and work on a specific business domain (e.g. Network & Customer Experience, Location Insights, Value Analytics, …).

Responsibilities of the Analyst / Developer

  • Responsible for the analysis, development, testing and follow-up of the implementation of project deliverables, with a focus on quality:
      • Load and integrate data from our multitude of applications, service platforms and networks into the Enterprise Data Warehouse/Data Lake via ETL solutions
      • Create datamarts and/or reports/dashboards
  • Ensure that your (and your colleagues') solutions meet the security (retail/wholesale, legal, …) and confidentiality requirements defined by the Legal and Security teams and by our Data Warehouse Data Security Officer (DSO)
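The posting names no specific ETL stack, so as an illustrative sketch only, the load-and-aggregate step that feeds a datamart could be prototyped in plain Python like this (the record fields and function name are hypothetical):

```python
from collections import defaultdict

# Hypothetical raw usage records as they might arrive from a service platform.
# The field names (customer_id, region, usage_mb) are illustrative only.
raw_records = [
    {"customer_id": "C1", "region": "BRU", "usage_mb": 120},
    {"customer_id": "C2", "region": "BRU", "usage_mb": 80},
    {"customer_id": "C3", "region": "ANT", "usage_mb": 200},
]

def build_usage_datamart(records):
    """Aggregate raw usage rows into a per-region summary table."""
    totals = defaultdict(int)
    for row in records:
        totals[row["region"]] += row["usage_mb"]
    # One summary row per region, as a datamart fact table might hold.
    return [{"region": r, "total_usage_mb": t} for r, t in sorted(totals.items())]

print(build_usage_datamart(raw_records))
```

In practice this aggregation would be expressed as a Teradata SQL `GROUP BY` or an ETL-tool mapping rather than hand-written Python; the sketch only shows the shape of the transformation.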

Profile of the Analyst / Developer

  • Profound knowledge of key concepts of Business Intelligence, Data Warehousing and entity-relationship (E-R), Third Normal Form (3NF) and dimensional modeling (knowledge of RDA helps)
  • Good knowledge of Hadoop and related technologies (Hortonworks Hadoop stack, HBase, MapReduce, Storm, Hive, Kafka, Scala, Spark, …)
  • Knowledge of and experience in:
      • Scripting (Perl, Java, Python, R, …)
      • NoSQL principles and systems: document stores, graph databases, key-value (KVP) stores
      • Massively parallel data processing (MPP) and tuning
      • Statistical modeling
      • BI tools (Informatica, MicroStrategy) and databases (Teradata, Oracle)
      • Modeling tools (RDA)
  • Experience of working in Scrum or another agile mode is an asset
  • Advanced to expert knowledge of SQL (Teradata)
  • Familiar with shell scripting in Unix/Linux environments
  • Basic knowledge of software deployment/versioning tools (Git, Jenkins) and of collaboration tools (Jira, Confluence)