ETL Architect (Informatica/Teradata/Hadoop) Resume

Newark, NJ

SUMMARY

  • An accomplished Big Data and DWH/BI Senior Consultant with demonstrated success in Requirements Gathering, Estimation, Analysis, Design, Development and Support of Big Data (Hortonworks) and BI/DWH solutions (Informatica/IBM Cognos) that improve Business Functionality and Decision-Making.
  • Experience in Integrating Data (ETL) from different platforms into HDFS, Hive, Data Hub, ODS and Data Warehouse.
  • Proven experience in dimensional and relational data modeling in IBM Cognos and Erwin.
  • Extensive experience in Designing/Development of Complex Reports, Dashboards and OLAP cubes for enterprise reporting and analysis. Experience in providing training to End Users and Developers.
  • 12+ years of total IT experience in Requirement Gathering, Design, Development, Testing, Analysis and Support of DWH/BI Solutions in the Big Data Ecosystem, Informatica and IBM Cognos.
  • 2+ years of hands-on experience in the Big Data Ecosystem, including HDFS, MapReduce, Spark, Hive, HBase, Pig, Sqoop, Flume, Kafka and Oozie.
  • 8+ years of hands-on experience in IBM Cognos BI Tools (Cognos BI 10.x, Cognos BI 8.x).
  • 9+ years of hands-on experience in Informatica Tool Sets (PowerCenter 10.x/9.x, BDM 10.1.1, Data Quality 10.x/9.x).
  • 3+ years of experience in Data Modeling using Erwin.
  • 8+ years of experience in Leading/Managing projects using Onshore/Offshore model from US.
  • Excellent understanding of Big Data (Hadoop/Spark) and Data Warehousing concepts.
  • Experience working with Spark SQL, RDDs, DataFrames and Spark Transformations for ETL (see the PySpark sketch after this list).
  • Hands-on experience in writing MapReduce jobs on the Hadoop ecosystem using Pig Latin and creating Pig scripts to carry out essential data cleaning operations and tasks.
  • Experience developing Sqoop scripts to import data from RDBMS to Hive and vice versa.
  • Excellent understanding and knowledge of Data Ingestion/Streaming tools such as Kafka and Flume for ingesting near real-time streaming data into HDFS from different sources.
  • Hands-on experience in performing analytics on structured data in Hive with Hive queries, Views, Partitioning, Bucketing and UDFs using HiveQL with Tez.
  • Worked with different file formats like JSON, XML, Avro data files and text files.
  • Experience in Designing Time Driven and Data Driven automated workflows using Oozie (combining multiple jobs for Hive, Pig and Sqoop into one logical unit of work).
  • Good knowledge of and experience with Python 2.7 and PySpark for Spark ETL programming.
  • Excellent understanding and knowledge of NoSQL databases like HBase, Cassandra and MongoDB.
  • Experience in working with Hadoop in Standalone, Pseudo-Distributed and Fully Distributed modes.
  • Extensive experience in translating Business Requirement into Technical Specification, HLDs, LLDs, STTM and Test cases.
  • Experience working in Informatica BDM 10.1.1 for Tool based Hadoop ETL Engine design POC.
  • Experience in Designing, Developing, Deploying and Supporting ETL solutions for Data Warehousing (mapping, session and workflow design) in Informatica PowerCenter with CDC/SCD loads.
  • Experience implementing Informatica PDO on Teradata for improving ETL loads performance.
  • Experience developing ETL workflows using Teradata utilities such as MLOAD, FLOAD and TPT.
  • Experience implementing Audit/Balance/Control mechanism for Informatica ETL load framework.
  • Experience loading data from various ERPs (Oracle EBS, MfgPro and JD Edwards), Flat Files, HL7, Hierarchical/Relational Data files, XML, Oracle, MS SQL Server, DB2 and MS Access.
  • Experience developing UNIX shell scripts for Informatica Workflow execution and source/target file validation/manipulation.
  • Extensive experience in designing/developing Relational Model in Cognos Framework Manager.
  • Extensive experience designing/developing Report Studio Reports from Cognos packages and Cubes.
  • Extensive experience designing/developing Power Cubes models in Cognos Transformer from Cognos Packages and IQDs.
  • Extensive experience with Cognos Access Management and Security (Persona IQ, Cognos Access Manager, Object level security and Data security).
  • Working experience on various RDBMS such as MS SQL Server, Oracle 10g/11g and Teradata 15.10. Experience in writing Procedures and Functions on Oracle 10g/11g and Teradata 15.10.
  • Possess good exposure to the Retail, Manufacturing and Healthcare Domains.
  • Experience working in Agile and Waterfall Development Methodology.
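
The Spark ETL work referenced above follows a read-transform-write pattern against Hive. A minimal, illustrative PySpark sketch of that pattern (all table and column names below are hypothetical, not taken from any engagement listed here):

    # Minimal PySpark ETL sketch: read a raw Hive table, cleanse it, and write a
    # partitioned curated table. Table/column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("etl_sketch")
             .enableHiveSupport()   # read/write Hive tables on the cluster
             .getOrCreate())

    raw = spark.table("datahub.claims_raw")

    cleansed = (raw
                .filter(F.col("claim_amount").isNotNull())
                .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
                .withColumn("load_month", F.date_format(F.col("claim_date"), "yyyyMM")))

    (cleansed.write
     .mode("overwrite")
     .partitionBy("load_month")      # Hive partitioning for downstream analysis
     .saveAsTable("datahub.claims_curated"))

    spark.stop()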

TECHNICAL SKILLS

Operating System: Windows, UNIX, Linux

RDBMS/Query Language: Teradata v15.10/14.10, Oracle 11g/10g, MS SQL Server 2012/2008, DB2, T-SQL, PL/SQL, BTEQ

Programming Language: Python 2.7 (PySpark)

Big Data Tools: Hadoop, YARN, Spark, PIG, HIVE, Flume, Kafka, Sqoop, Oozie, Tez

Hadoop Distribution: Hortonworks HDP 2.5.3/2.6.4

ETL Tools: Informatica BDM 10.1.1, Informatica PowerCenter 10.1/9.1, Informatica Data Quality 10.1/9.1

Data Modeling Tools: Erwin, Cognos Transformer, Cognos Framework Manager

BI Reporting Tools: Cognos BI 10.2.1/8.4.1 (Cognos Framework Manager, Analysis Studio, Report Studio, Query Studio and Cognos Transformer)

IT Service/Change Management: BMC Remedy 6.0, IBM ClearQuest

Version Management: Subversion, MS SharePoint

Misc Tools: MS Visio 2010, IBM Data Studio 2.2, Putty, WinSCP, PL/SQL Developer, SQL Developer, Teradata SQL Assistant, Teradata Studio Express, MS Management Studio, Tortoise SVN, HUE, Ambari, Jenkins, Visual Studio Code, Eclipse, GIT

PROFESSIONAL EXPERIENCE

Confidential, Newark, NJ

ETL Architect (Informatica/Teradata/Hadoop)

Responsibilities:

  • Performing POC for Hadoop Data Ingestion and ETL pipelines using Informatica BDM 10.1.1.
  • Designing Informatica ETL Architecture for data ingestion in Data Hub and perform Complex Transformations.
  • Preparing HLDs and LLDs based on TRDs provided by Business System Analyst.
  • Involved in creating Hive tables with partitions and buckets, loading data and writing hive queries for Data Analysis.
  • Wrote Sqoop scripts to load data from the Data Hub to HDFS/Hive tables.
  • Designed/Developed Spark ETL code to transform data using Spark RDD and Spark SQL in PySpark.
  • Extensively used Pig Latin scripts for data cleansing.
  • Performed Unit Testing using PyTest (see the test sketch after this list).
  • Review of Additional Informatica ETL, Spark, PIG, Sqoop code and builds developed by offshore.
  • Preparing Deployment plan for SIT, UAT and Production Migration.
  • Defect Management, Tracking and Fix.
  • Production Migration and Post Implementation support.
  • Onshore - Offshore coordination.
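
Unit testing of the PySpark code above was done with PyTest; a small sketch of that style of test (the transformation function and data here are made up for illustration, not the project's actual code):

    # PyTest unit test for a small PySpark transformation (illustrative only).
    import pytest
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def add_load_month(df):
        # Derive a yyyyMM partition column from claim_date.
        return df.withColumn("load_month", F.date_format(F.col("claim_date"), "yyyyMM"))

    @pytest.fixture(scope="module")
    def spark():
        session = (SparkSession.builder
                   .master("local[1]")
                   .appName("unit-test")
                   .getOrCreate())
        yield session
        session.stop()

    def test_add_load_month(spark):
        df = spark.createDataFrame([("2017-03-15",)], ["claim_date"])
        row = add_load_month(df).collect()[0]
        assert row["load_month"] == "201703"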

Environment/Tools: Informatica PowerCenter 10.1, Informatica Data Quality 10.1, Teradata v15.10, BTEQ, Hortonworks HDP 2.5.3, Spark, PIG, HIVE, Sqoop, Windows, Linux, Shell Scripting.

Confidential, Newark, NJ

ETL Technical Lead (Informatica/Teradata)

Responsibilities:

  • Conducting Gap Analysis for requirements.
  • Performing POC for ingesting MS Access, HL7 Messages and NCPDP Files into Informatica ETL Loads.
  • Performing Data profiling using IDQ on source data.
  • Helping the Data Architect in creating Conceptual, Logical and Physical Data models for the Data Hub on Teradata v14.10.
  • Designing Informatica ETL Architecture for disparate sources to Data Hub load.
  • Preparing HLDs and LLDs based on TRDs provided by Business System Analyst.
  • Designing Audit/Balance/Control Mechanism for ETL loads (see the sketch after this list).
  • Review of Informatica ETL code developed by offshore.
  • Preparing Deployment plan for SIT, UAT and Production Migration.
  • Defect Tracking and Fix.
  • Production Migration and Post Implementation support.
  • Onshore - Offshore coordination.
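
The Audit/Balance/Control mechanism above reduces to reconciling source and target row counts for each load and recording the result. A minimal, tool-agnostic Python sketch of the idea (names and tolerance are hypothetical; the production mechanism was built in Informatica):

    # Conceptual Audit/Balance/Control (ABC) check: compare source and target row
    # counts for a load and return a pass/fail audit record. Illustrative only.
    from datetime import datetime

    def abc_check(load_name, source_count, target_count, tolerance=0):
        status = "PASS" if abs(source_count - target_count) <= tolerance else "FAIL"
        return {
            "load_name": load_name,
            "source_count": source_count,
            "target_count": target_count,
            "status": status,
            "audited_at": datetime.now().isoformat(),
        }

    # Example: a failed balance check would stop the downstream load.
    record = abc_check("member_dim_load", source_count=120000, target_count=119998)
    assert record["status"] == "FAIL"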

Environment/Tools: Informatica PowerCenter 9.5.1, Informatica Data Quality 9.5.1, Informatica B2B Data Transformation 9.5.1, Teradata v14.10, BTEQ, Windows, Linux, Shell Scripting, Erwin

Confidential, Galesburg, MI

Technical Lead (Informatica)

Responsibilities:

  • Gathering requirements from the Confidential Business Analyst and getting the requirements signed off.
  • Conducting Gap/Feasibility Analysis between the requirements and the technical details of the system.
  • Designing the High Level ETL Architecture for the solution.
  • Estimating the effort and preparing project timeline.
  • Creating Data model for Staging, Extract and Audit Layers.
  • Preparing HLDs and STTM documents for offshore development team.
  • Performing Informatica ETL Code Review before SIT.
  • Conducting UAT with Business Users and Zilliant.
  • Production Migration and Post Implementation support.

Environment/Tools: Informatica PowerCenter 9.5.1, Informatica Data Quality 9.5.1, Oracle 10g, PL/SQL, Windows, Linux, Shell Scripting, Control-M, PL/SQL Developer, Erwin

Confidential, Galesburg, MI

Technical Lead (IBM Cognos)

Responsibilities:

  • Conducting feasibility analysis and POC of the pilot report.
  • Creating High Level Design document.
  • Developing the Cognos Framework Manager Model (3-Tier Architecture) for CBOM reporting with the offshore team.
  • Applying data security in FM to hide columns based on user role.
  • Developing Cognos Transformer Model (.pyj) for Cube.
  • Creating custom views for security.
  • Publishing the cube package for analysis studio.
  • Unit testing and UAT with business users.
  • Production implementation (moving packages, reports, the Transformer model and the Windows script to production).
  • Post implementation support.

Environment/Tools: Cognos BI 10.2.1, Cognos Transformer, Persona IQ, Oracle Exadata, PL/SQL Developer

Confidential, Galesburg, MI

Technical Lead/Senior Developer (Informatica/IBM Cognos)

Responsibilities:

  • Worked with Business Analyst to gather requirements and create HLDs, LLDs and STTM.
  • Designing ETL Architecture and BI Reporting Solution.
  • Developing/Enhancing Data Model for ETL solution.
  • Developing/Maintaining ETL solutions in Informatica Powercenter 9.5.1 to pull data from various source systems (Oracle EBS 11i, Mfg/Pro, RWS and CIS II).
  • Developing/Maintaining Cognos reporting in (DW and Real Time reporting) on various Data Warehouse and ERP systems (Oracle EBS 11i, Mfg/Pro, JD Edwards Enterprise One).
  • Resolving Incidents within SLA.
  • Onshore/Offshore Coordination.

Environment/Tools: Informatica 9.5.1, Informatica Data Quality 9.5.1, Cognos BI 8.4.1, Cognos Transformer, Oracle 10g, PL/SQL, Windows, Linux, Shell Scripting, Control-M, SharePoint, PL/SQL Developer

Confidential, Galesburg, MI

Senior BI/DW Developer (Informatica/IBM Cognos)

Responsibilities:

  • Worked with the Business Analyst to gather requirements and created TDDs (Technical Design Documents) based on the BRDs (Business Requirement Documents).
  • Analyzed various Source systems (Oracle EBS, RWS, and CIS2) to pull the data into Data warehouse.
  • Worked with Data Modelers to create STTM for ETL development.
  • Created Informatica Mappings/Workflows for loading dimension and fact tables in the central data warehouse along with the offshore team.
  • Implemented CDC (Change Data Capture) in loading the Fact tables (see the extract sketch after this list).
  • Created Stored Procedures in Oracle for use in Informatica mappings.
  • Wrote SQL queries for Pre- and Post-SQL execution in Informatica workflows.
  • Created UNIX scripts to schedule the workflows using Control-M.
  • Designing the Cognos Framework Model for the Vehicle Warehouse.
  • Publishing the packages to Cognos Connection Portal for reporting.
  • Designing complex reports in Report Studio.
  • Designing Cubes Models (.pyj) in Cognos Transformer.
  • Designing Cubes View in Analysis Studio.
  • Created Unit Test Cases and performed Unit Testing of the Solution.
  • Production migration of ETL and Cognos Solutions.
  • Conducting UAT and assisted in user training.
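
The CDC approach above keys incremental fact loads off the last successful load timestamp. The production logic lived in Informatica mappings, but the underlying extract pattern can be sketched as a parameterized query (table and column names are hypothetical):

    # Conceptual CDC (Change Data Capture) extract: pull only rows changed since
    # the last successful load. Expressed as a SQL string purely for illustration.
    def build_cdc_extract_sql(last_load_ts):
        return (
            "SELECT order_id, order_date, amount, last_update_date "
            "FROM stg.sales_orders "
            "WHERE last_update_date > TO_DATE('{0}', 'YYYY-MM-DD HH24:MI:SS')"
        ).format(last_load_ts)

    print(build_cdc_extract_sql("2012-06-30 23:59:59"))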

Environment/Tools: Cognos BI 8.4.1, Cognos Transformer 8, Informatica 8.6, Oracle 10g, PL/SQL, Windows, Linux, Shell Scripting, Control-M, PL/SQL Developer

Confidential

BI Developer

Responsibilities:

  • Worked with IT Manager to create Migration Plan.
  • Migrated the Cognos Impromptu Catalog to Cognos Framework Manager.
  • Migrated Transformer models based on IQDs, CSV and TXT files to Cognos FM-based packages.
  • Migrated IMRs (Impromptu Reports) to Report Studio reports with Prompt and Drill-through capabilities.
  • Performed Unit Testing of the Solution.
  • Conducted User Acceptance Testing.
  • Migrated all Package, reports and Cubes to production and provided warranty support.

Environment/Tools: Cognos ReportNet 1.1, Cognos Impromptu 7.4, Cognos PowerPlay Transformer 7.4, MS SQL Server 2000, DB2, Flat files

Confidential

BI Developer (IBM Cognos)

Responsibilities:

  • Developing and maintaining catalog and user profiles.
  • Developing and maintaining Transformer models based on IQDs, CSV and TXT files.
  • Developing and maintaining dimensions, measures and levels (Multidimensional model).

Environment/Tools: Cognos Impromptu 7.0, Cognos PowerPlay 7.0, Cognos PowerPlay Transformer 7.0, DB2, Flat files, Windows Server 2003
