
Informatica Developer Resume


PROFILE:

  • 8+ years of hands-on programming experience, with 4+ years on the Hadoop platform
  • Proficiency in data analysis and strong SQL skills.
  • Knowledge of various components of the Hadoop ecosystem and experience applying them to practical problems: Hive, Impala, Spark (Scala), MapReduce
  • Proficiency with shell scripting & Python
  • Experience in data warehousing, ETL tools, MPP database systems
  • Experience working in Hive and Impala, including creating custom UDFs and custom input/output formats/SerDes
  • Ability to acquire, compute, store and process various types of datasets in Hadoop platform
  • Understanding of various Visualization platforms (Tableau)
  • Experience with Scala.
  • Excellent written and verbal communication skills
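The custom Hive UDF work noted above can be illustrated with a minimal sketch. Rather than a compiled Java UDF, this uses Hive's TRANSFORM clause to stream rows through a Python script; the table, column names, and normalization rule are illustrative assumptions, not taken from any actual project.

```python
# Minimal sketch of a Hive "streaming UDF": a Python script plugged into
# Hive's TRANSFORM clause (an alternative to writing a Java UDF).
# Hypothetical HiveQL invocation:
#   SELECT TRANSFORM (user_id, raw_email)
#   USING 'python normalize_email.py'
#   AS (user_id, email)
#   FROM users;
import sys


def normalize(email):
    """Lower-case an email address and strip surrounding whitespace."""
    return email.strip().lower()


def process(line):
    """Transform one tab-separated input row into one output row."""
    user_id, email = line.rstrip("\n").split("\t")
    return "\t".join([user_id, normalize(email)])


def main():
    # Hive streams rows on stdin, one tab-separated line per row;
    # main() would be the entry point when Hive invokes the script.
    for line in sys.stdin:
        print(process(line))
```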

TOP SKILL SETS / TECHNOLOGIES:

SQL

Hive / (Talend, Pentaho, Informatica, or similar ETL) / Impala / MapReduce / Spark

Unix

Scala / Python / Java

ETL / Data warehousing

RDBMS/Data Modelling

Tableau/Qlikview

SKILL REQUIREMENTS:

Informatica Developer

Confidential

Responsibilities:

  • Design and Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real-time.
  • Data Analysis and Data exploration
  • Explore new data sources and data from new domains
  • Productionize real-time and batch ML models in Python/Spark
  • Evaluate big data technologies and prototype solutions to improve our data processing architecture.

Data Analyst

Confidential

Responsibilities:

  • Data Analysis and Data exploration
  • Design and Build distributed, scalable, and reliable data pipelines that ingest and process data at scale.
  • Explore new data sources and data from new domains
  • Build feature computation pipelines for various ML & rule based models.
  • Build exploratory dashboards in Tableau
  • 8+ years of experience and proficiency in data analysis
  • Strong SQL skills.
  • Knowledge of various components of Hadoop ecosystem and experience in applying them to practical problems - Hive/Impala/Spark/MR.
  • Proficiency with shell scripting & Python
  • Experience in data warehousing, ETL tools, MPP database systems
  • Flair for data, schemas, and data models
  • Understanding of data warehousing concepts.
  • Understanding of various Visualization platforms (Tableau, Qlikview, others)
  • Experience with Java or Scala will be a big plus.
  • Excellent written and verbal communication skills
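The feature-computation pipelines for rule-based models mentioned above can be sketched in miniature. This is a hedged illustration only: the field names (`txn_amount`, `txn_count`), thresholds, and rules are placeholder assumptions, not drawn from any real model.

```python
# Sketch of a feature-computation step feeding a toy rule-based model.
# All field names and thresholds below are illustrative placeholders.


def compute_features(record):
    """Derive simple features from one raw transaction record (a dict)."""
    amount = record["txn_amount"]
    count = record["txn_count"]
    return {
        "avg_txn_amount": amount / count if count else 0.0,
        "is_high_value": amount > 10_000,  # threshold is a placeholder
    }


def apply_rules(features):
    """A toy rule-based model: flag a record that matches any rule."""
    rules = [
        lambda f: f["is_high_value"],
        lambda f: f["avg_txn_amount"] > 5_000,
    ]
    return any(rule(features) for rule in rules)
```

In a production pipeline the same per-record logic would typically run inside a Spark map stage rather than over plain dicts.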

Top skill sets / technologies:

  • SQL
  • Hive/Impala/MapReduce/Spark
  • Unix
  • Scala / Python / Java
  • ETL /Data warehousing
  • RDBMS/Data Modelling
  • Tableau
  • Machine Learning Knowledge

Data Engineer - Hadoop

Confidential

Responsibilities:

  • Proficiency in Apache SOLR (experience of projects in production)
  • In depth knowledge of DB concepts
  • Good understanding of Hadoop
  • Proficient in Java
  • Source control experience (preferably GitHub)
  • Experience in Scala
  • Experience in Lucidworks Fusion
  • Experience in Hive, Impala, and other Hadoop ecosystem components
  • Experience in Spark & Machine Learning
  • Good understanding of RDBMS & SQL
  • Writing efficient search queries against Solr indexes using Solr REST/Java API
  • Prototype and demonstrate feasibility of new ideas leveraging Solr capabilities
  • Write Hive scripts to aggregate/transform core data and prepare for Solr indexing
  • Design, create and manage shards and indexes for Solr cloud.
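The Solr query work described above can be sketched as follows. This is a hedged example of building a request against Solr's standard `/select` endpoint; the base URL, collection name, and field names are hypothetical, and the snippet only constructs the URL rather than issuing the HTTP call.

```python
# Sketch of composing a Solr search query against the standard /select
# endpoint, as in "writing efficient search queries against Solr indexes"
# via the REST API. Collection and field names are hypothetical.
from urllib.parse import urlencode


def solr_select_url(base, collection, query, rows=10, fields=None):
    """Construct a Solr /select query URL (does not issue the request)."""
    params = {"q": query, "rows": rows, "wt": "json"}
    if fields:
        params["fl"] = ",".join(fields)  # restrict the returned fields
    return f"{base}/{collection}/select?{urlencode(params)}"


url = solr_select_url(
    "http://localhost:8983/solr", "products",
    "title:hadoop", rows=5, fields=["id", "title"],
)
```

In practice the resulting URL would be fetched with an HTTP client (or the query issued through SolrJ), and the JSON response's `response.docs` array parsed.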

Data Modeler

Confidential

Responsibilities:

  • Extensive data modeling experience - Normalized ER modeling and dimensional modeling. In dimensional modeling, the resource should be able to demonstrate hands-on experience in designing slowly changing dimensions, understanding and implementation of the CDC process, metrics design.
  • Should be comfortable modeling right from the conceptual modeling to logical and physical modeling.
  • Should be able to demonstrate sound data modeling standards knowledge, approach to requirements gathering, physical model translations etc.
  • Should show in-depth skills in database layer, semantic layer architectures.
  • In-depth understanding of data warehouse architecture. Expertise with Hadoop (Hive, Spark, Scala, Impala) and Teradata is preferred.
  • Extensive experience in using data modeling tool, PowerDesigner. This is to include the following:
  • Creating models from scratch
  • Reverse engineering databases and structures from flat files.
  • Creation of multiple physical models from single logical models
  • Database creation & DDL generation
  • Building data dictionary
  • PowerDesigner model reporting
  • PowerDesigner repository setup and management
  • An aptitude for data discovery, mining and translating business understanding to detailed requirements for data architecture.
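The slowly changing dimension and CDC design work above can be sketched as a minimal Type 2 SCD update: when a tracked attribute changes, the current row is expired and a new current row is inserted. The column names (`cust_id`, `city`, effective dates, current flag) are illustrative assumptions; a real implementation would run as SQL or ETL logic against the dimension table.

```python
# Minimal sketch of a Type 2 slowly changing dimension update.
# Rows are dicts standing in for dimension-table records; column
# names are illustrative placeholders.
from datetime import date


def scd2_apply(dimension, incoming, today):
    """Apply one incoming record to a list of dimension rows in place."""
    current = next(
        (r for r in dimension
         if r["cust_id"] == incoming["cust_id"] and r["is_current"]),
        None,
    )
    if current and current["city"] == incoming["city"]:
        return dimension                  # no change detected: no-op
    if current:
        current["is_current"] = False     # expire the previous version
        current["end_date"] = today
    dimension.append({                    # insert the new current version
        "cust_id": incoming["cust_id"],
        "city": incoming["city"],
        "start_date": today,
        "end_date": None,
        "is_current": True,
    })
    return dimension
```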

Tableau Developer

Confidential

Responsibilities:

  • 5+ years of professional experience in building dashboards, scorecards using Tableau
  • Hands-on professional with thorough knowledge of scripting, data source integration and advanced dashboard development in Tableau
  • Full understanding of the processes of data quality, data cleansing and data transformation
  • Ability to write complicated and efficient SQL queries
  • Strong knowledge of Tableau Server architecture and applying business rules and data validations
  • Experience in end-to-end implementation of Business Intelligence (BI) projects, especially in scorecards, KPIs, reports & dashboards
  • Background in software/web application development with tools such as Java, Python
  • Comfortable in manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources
  • Knowledge of formal database architecture and design. Good SQL experience (preferably Teradata and Hadoop)
  • Visualization and UX experience
  • Ability to identify key business requirements and address them through innovative data-visualization solutions
  • Ability to effectively interact with business partners, present information to them, and respond to their questions
  • Ability to communicate complex analysis in a clear, precise, and actionable manner
  • Ability to research and troubleshoot technical problems
  • Domain experience in Capital Markets, Wealth Management (Advisory and Brokerage), and Retail Banking, requiring a thorough understanding of investment product concepts, market data, etc.
  • Retail banking knowledge involves concepts such as loans, mortgages, HELOCs, etc.
  • Understanding of securities data.

Informatica Developer

Confidential

Responsibilities:

  • 8+ years of experience in ETL development
  • Knowledge of Informatica
  • Knowledge of scripting languages (Unix shell scripting)
  • Teradata and Hadoop knowledge
  • Strong background in data warehouse architecture, methods, concepts, techniques, dimensional modelling
  • Experience with scheduling tools (Autosys, TWS)
  • Extensive technical understanding spanning multiple platforms, with application-level expertise across a portfolio of applications and broad knowledge of business strategic priorities, in order to resolve complex problems
  • Performance and scalability tuning of ETL jobs to support large deployments
  • Ensure quality and completeness of the final product through unit testing, documentation and maintenance as appropriate.
  • Eager to work with new technologies and apply them towards enterprise-level data solutions
  • Ability to understand data requirements and able to create data mapping documents
  • Experience in defining data governance processes to improve the quality of the data
  • Responsible for analysis, scope and time estimation of ETL changes required to support complex report and dashboard requirements
  • Leading a team of developers working on one or multiple projects of varying complexity, with overall accountability for the quality and timeliness of the team's deliveries
  • Financial Services experience will be a strong plus
  • Knowledge of scripting languages such as Python
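The data-mapping and data-governance responsibilities above can be sketched as a pre-load quality check: validating staged source rows against a source-to-target mapping document before the ETL load. The mapping, column names, and rules here are illustrative assumptions, not from any actual mapping document.

```python
# Hedged sketch of a data-governance quality check driven by a
# source-to-target mapping document. All names are placeholders.

MAPPING = {
    # target_column: (source_column, required?)
    "customer_id": ("CUST_ID", True),
    "customer_name": ("CUST_NM", True),
    "phone": ("PHONE_NO", False),
}


def validate_row(row):
    """Return a list of quality errors for one staged source row (a dict)."""
    errors = []
    for target, (source, required) in MAPPING.items():
        value = row.get(source)
        if required and (value is None or value == ""):
            errors.append(f"{source} -> {target}: required value missing")
    return errors
```

A check like this typically runs against a sample or the full staging table, with the error list fed into rejection/reconciliation reporting.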
