
IT-Technology Lead / Sr. Data Engineer Resume

San Francisco, CA

SUMMARY:

  • 13+ years of total IT experience in EDW/BI/ETL/AWS Cloud/Hadoop/Big Data across various projects, involving the complete SDLC: Analysis, Design, Testing and Implementation.
  • 10+ years of experience in developing complex ETL solutions using Informatica PowerCenter, building data warehouses and data marts for clients in major industry sectors such as Banking, Manufacturing, Retail, Healthcare, Telecom and Pharmaceuticals.
  • 10+ years of extensive experience in extracting data from different sources such as Oracle, DB2, SQL Server, XML, Flat Files, COBOL files, SAS Datasets, Confidential objects, and Oracle ERP sources.
  • 2+ years of hands-on experience in designing, developing and maintaining BI solutions using Amazon S3 & Redshift.
  • Strong experience in business requirements analysis and design, and in converting business requirements into high-level functional specifications.
  • Ability to interact with various groups, ranging from business users and business IM teams to analysts; built long-lasting relationships with clients in all previous engagements.
  • Expertise in the Extraction, Transformation and Loading (ETL) process using Informatica, consisting of administration, data transformation, data sourcing, mapping, conversion and loading.
  • Performance tuning of Informatica mappings by identifying bottlenecks and implementing effective transformation logic.
  • Worked extensively with the ETL tools Informatica PowerCenter and Ab Initio.
  • Hands-on experience with the Big Data stack: Hadoop, HDFS, Hive, HBase, Sqoop, Kafka, MapReduce.
  • Strong knowledge of multiple cloud technologies, including EC2, S3, Redshift, VPC, EBS, ELB, EMR, DynamoDB, Lambda and Route 53.
  • Good knowledge of NoSQL databases HBase/Cassandra.
  • Good understanding of the reporting tools Business Objects & Cognos.
  • Experience in writing and debugging database stored procedures, functions, cursors, packages and triggers in Oracle & DB2 databases.
  • Experience in writing Unix Shell Scripts.
  • Expertise in ER modeling, Dimensional modeling (Star & Snowflake schema - Ralph Kimball methodology) and Conceptual data model.
  • Excellent client interaction, communication, documentation, multitasking, research, and interpersonal skills, with a proven ability to communicate effectively with customers; able to analyze viable alternatives and provide innovative solutions.
  • Flexible, enthusiastic and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.

TECHNICAL SKILLS:

DW/Business Intelligence: Informatica PowerCenter (Version 5.1 to 10.1), Ab Initio 1.8, Informatica Data Quality (Version 9.5), Business Objects XI R2, Cognos 8.0, Salesforce Wave/Einstein, Load Sharing Facility (LSF), Control M, TIDAL

Cloud: AWS (EC2, S3, Redshift, VPC, EBS, ELB, EMR, DynamoDB, Lambda), Boto3 SDK for Python

Distributed Systems: Kafka Streams, Hive

No SQL: HBase, Cassandra

Hadoop Stack: HDFS, Sqoop

Databases: Oracle (Version 7 to 11g), DB2 9.1, SQL Server (7.0 to 2008), MS-Access

Languages: Python, SQL, PL/SQL, Shell Programming (Ksh, Bsh)

Modeling & GUI Packages: Erwin 7.1, MS Project 2007, MS Visio 2007, SQL*Loader, TOAD for Oracle, TOAD for SQL Server, AQT, CVS, MS Office, Visual Basic 6

Operating Systems: Windows 9x/2K/NT/XP, MS-DOS, UNIX, Linux

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco, CA

IT-Technology Lead / Sr. Data Engineer

Responsibilities:

  • Participated in the requirement analysis, definition and implementation for the Informatica ETL conversion and the data integration for a core Confidential redesign project.
  • Working with Confidential lead designers/architects to gain an understanding of the existing extract, transform and load processes ('ETLs') running as Oracle DB packages, and defining the requirements for the conversion to Informatica PowerCenter.
  • Ingesting structured data from RDBMS to HDFS/Hive using Sqoop.
  • Building data pipelines using Kafka, HIVE, Python, Json and ETL.
  • Working with Hive external, internal tables.
  • Building data pipeline ETLs for data movement to S3, then to Redshift.
  • Optimizing the Redshift data warehouse by implementing workload management, sort keys & distribution keys.
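The S3-to-Redshift load and sort/distribution key tuning described above can be sketched as follows. This is an illustrative sketch only: the table, column, bucket, and IAM role names are hypothetical placeholders, not objects from the actual project.

```python
# Sketch of the S3 -> Redshift pipeline step: a DDL with DISTKEY/SORTKEY
# for co-located joins and range-restricted scans, plus a bulk COPY from S3.
# All identifiers below are hypothetical examples.

def build_create_table(table, columns, dist_key, sort_keys):
    """Build a Redshift CREATE TABLE statement with DISTKEY/SORTKEY clauses."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    sort = ", ".join(sort_keys)
    return (
        f"CREATE TABLE {table} (\n  {cols}\n)\n"
        f"DISTKEY({dist_key})\n"
        f"COMPOUND SORTKEY({sort});"
    )

def build_copy(table, s3_path, iam_role):
    """Build a COPY command that bulk-loads the table from S3."""
    return (
        f"COPY {table}\n"
        f"FROM '{s3_path}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        f"FORMAT AS CSV IGNOREHEADER 1;"
    )

ddl = build_create_table(
    "sales_fact",
    [("sale_id", "BIGINT"), ("customer_id", "BIGINT"), ("sale_date", "DATE")],
    dist_key="customer_id",        # distribute on the common join key
    sort_keys=["sale_date", "customer_id"],  # prune blocks on date filters
)
copy_sql = build_copy(
    "sales_fact",
    "s3://etl-staging/sales/",
    "arn:aws:iam::123456789012:role/redshift-load",
)
```

In practice the generated SQL would be executed against the cluster (e.g. via a Boto3/psycopg2 connection); choosing the join key as DISTKEY and the filter column as the leading SORTKEY is the standard tuning pattern the bullet refers to.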

Environment: Amazon Web Services (AWS) - S3, Redshift, Boto3 SDK for Python, Informatica PowerCenter 10.1, Salesforce Cloud, Marketing Cloud, Oracle 11g, Business Objects, Wave Analytics, Shell Scripting, TIDAL Scheduler, Perforce for deployments, GitHub, Agile Methodology.

Confidential, Newark, CA

Sr. Informatica Consultant

Responsibilities:

  • Working with Architect and Global Supply Chain Analytics team to understand the requirements.
  • Developing / Enhancing data integration processes using Informatica’s PowerCenter for Global Supply Chain team.
  • Building data pipeline ETLs for data movement to S3, then to Redshift.

Environment: Informatica PowerCenter 9.6, Pentaho 5.2, Oracle 11g, Oracle R12 ERP, OBIEE, Tableau, Linux, JAMS Scheduler, AWS S3, Amazon Redshift

Confidential, San Francisco

Sr. Informatica Consultant

Responsibilities:

  • Participated in the Requirement analysis, definition and implementation for the Confidential Conversion and the Data Integration.
  • Working with Confidential subject matter experts ('SMEs') to gain an understanding of the existing extract, transform and load ('ETLs') running on SQL Server DTS Packages, T-SQL, PL/SQL, SAS, and define the requirements for the conversion to Informatica PowerCenter.
  • Implemented data integration processes using Informatica’s PowerCenter in accordance with the functional specifications developed, and following Confidential ETL standards and processes.
  • Researched and evaluated the existing ETL Processes and provided the alternative solutions and recommended the most efficient ETL solutions.
  • Designed and Developed the Data Marts for three of the Key HR Business Processes using standard data modeling methodologies.
  • Developed and documented the technical design for the ETL processes.
  • Designed and developed Extract-Transform-Load (ETL) routines.
  • Work with SMEs during UAT to resolve and rectify all issues that are uncovered.
  • Provide support for Confidential 's production environment until all data integration processes have been successfully deployed into the Confidential 's production environment.
  • Assist with the deployment of tested and validated PowerCenter data integration processes.
  • Document all processes and architecture, and conduct knowledge transfer sessions with Confidential personnel.

Environment: Informatica PowerCenter 9.1, Oracle 11g, DB2, Teradata, SQL Server 2008, SAS, SQL, PL/SQL, T-SQL, Linux, TFS for Deployments, Load Sharing Facility (LSF), BMC Remedy (PAC2000)

Confidential, Sunnyvale

ETL Lead Consultant

Responsibilities:

  • Working with DW Analyst/Architect and Business Systems Analyst to understand the requirements.
  • Developing data integration processes using Informatica’s PowerCenter in accordance with the Project Vision document.
  • Developing Informatica Data Quality (IDQ) ETL jobs for data profiling.
  • Developed IDQ jobs using transformations such as Merge, Match, Parser, Decision, Key Generator and Association, then migrated them into Informatica PowerCenter for use as PowerCenter objects.
  • Designed and developed complex ETLs with high volume of data, including performance tuning.
  • Performing Informatica administration tasks such as migrating ETLs between the Confidential repository folders and environments (DEV, TST, PRD), folder creation, and creating users and granting permissions.

Environment: Informatica PowerCenter 9.5, Informatica IDQ 9.5, Oracle 11g, Oracle R12 ERP, OBIEE, DAC, Linux

Confidential, San Jose, CA

ETL Consultant/Team Lead

Responsibilities:

  • Working in Onsite/Offshore model by managing/coordinating a team of BI/ETL Developers.
  • Involved in the ETL development tracks of various sub-projects and facilitate controlled data movement across platforms.
  • Analysis of requested changes in current processes and implementing the enhancements in existing ETL Informatica environments.
  • Awarded by Confidential management for the successful development and execution of ETLs in a high-priority business subject area with much-needed business visibility.
  • Led the complete ETL development for extracting the Confidential CRM data from Confidential objects with Informatica PowerCenter 8.6.
  • Involved in the integration of the “BMI Quote Management System” with “Confidential Opportunity Management”.
  • Developed the ETL with the Web Services Consumer transformation to pull master data from Salesforce via web services.
  • Performed business analysis to identify and understand the underlying project-related Confidential, BMI and Marketing processes and fit them into the BI framework.
  • Involved in full and incremental migration and change management process of ETL and database schemas in multiple environments (DEV, UAT, TST and PRD)
  • Coordinating development effort with offshore development team and responding to queries or issues from development team.
  • Responsible for migration of the ETL code to the Production Informatica Repository for entire ETL development/enhancements.
  • Involved in Informatica administration tasks such as upgrading Informatica PowerCenter 8.1 to PowerCenter 8.6, folder creation, permissions, etc.

Environment: Informatica 8.6, Business Objects XI R2, Oracle 10g, OBIEE 10.1.3.2, SQL, PL/SQL, Linux

Confidential, Cleveland, OH

ETL Consultant

Responsibilities:

  • Responsible for coordinating with the Business Analysts to understand AG’s business, functional requirements of RPG/PPY data conversion to AG data requirements and implemented the same into an ETL Technical design.
  • Responsible for the complete RPG & PPY conversion ETL technical design and for developing the ETLs to extract, convert and load 3 years of historical RPG/PPY transactional data into AG’s data warehouse.
  • Implemented an error-handling approach as part of the ETL logic (i.e., how data errors will be trapped, reported, researched, corrected and recycled).
  • Involved in assessment of data availability and data quality (both RPG and PPY), including business’s view of data quality requirements for each category of data (i.e. RPG/PPY data to AG data requirements).
  • Developed complex Aggregation calculations through Mapplets.
  • Responsible for writing/documenting the Unit Test Cases with different testing scenarios to meet all the complex business rules implemented in ETL mappings.
  • Developed stored procedures and calling them into Stored Procedure transformation to create the sequence in Data warehouse and Data Mart Tables.
  • Documentation of ETL Mapping Specifications.
  • Analyzing the data feeds (through Flat File & COBOL sources) by working with the Mainframe team.
  • Adhered to the quality code standards described by the client.

Environment: Informatica 8.6, Cognos 8.0, DB2 9.1, SQL, PL/SQL, AIX- UNIX.

Confidential, Columbus, OH

ETL Consultant

Responsibilities:

  • Responsible for coordinating with the Business Analysts and users to understand business and functional requirements and implement the same into an ETL design document.
  • Working closely with business users and technical teams in a customer facing role to bring requirements together into a cohesive design.
  • Developed Functional Specifications for business Process Refinement and Automation.
  • Conducted Joint Application Development (JAD) sessions and interviewed Subject Matter Experts (SMEs), asking detailed functionality aspects of business process and carefully updating the information to the requirements in an easily understandable format.
  • Extensively involved in Documentation of ETL Mapping Specification, Data Mart Logical & Physical Model, Data Mart Data Dictionary, and all other ETL Process documents.
  • Adhered to the ETL best practices and the quality standards described by the client.
  • Identifying the issues and working closely with Business Users and technical teams to resolve the issues.
  • Enhancements to the existing data sources to meet the customer’s requirements.
  • Analyze new data feeds.
  • Performance tuning of Informatica mappings by identifying bottlenecks and implementing effective transformation logic.
  • Worked with DBA team to fix performance issues in ETL programs.
  • Developed stored procedures and functions to handle various activities outside of Informatica ETL process. Created Functions & Procedures for the ETL process and for Data Validations between source and target tables.
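The source-to-target validation mentioned above was implemented as PL/SQL procedures on the project; a minimal sketch of the same reconciliation idea is shown here in Python for illustration, with hypothetical sample data. The typical checks are row counts, a sum over a measure column, and key coverage.

```python
# Illustrative source-vs-target data validation (the project used PL/SQL;
# this Python sketch just shows the reconciliation logic).

def validate(source_rows, target_rows, key, measure):
    """Return a list of discrepancy messages; an empty list means the load reconciles."""
    issues = []
    # 1. Row counts must match.
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    # 2. Control total over a numeric measure must match.
    src_sum = sum(r[measure] for r in source_rows)
    tgt_sum = sum(r[measure] for r in target_rows)
    if src_sum != tgt_sum:
        issues.append(f"sum({measure}) mismatch: {src_sum} vs {tgt_sum}")
    # 3. Every source key must be present in the target.
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"keys missing in target: {sorted(missing)}")
    return issues

# Hypothetical sample data standing in for source/target table extracts.
source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target_good = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
target_bad = [{"id": 1, "amount": 100}]
```

A clean load returns no issues (`validate(source, target_good, "id", "amount")` is empty), while a short-loaded target reports count, sum, and missing-key discrepancies.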

Environment: Informatica 8.6, Cognos 8, SQL, PL/SQL, DB2, UNIX, Enterprise Manager.

Confidential, Chicago, IL

System Analyst/ETL Consultant

Responsibilities:

  • Understanding business need, requirements gathering, developed Conceptual data model.
  • ETL processes (development/enhancement of ETL routines) were implemented using Informatica PowerCenter in Oracle, Windows and UNIX environments.
  • Identified and implemented any transformations or conversions required to maximize consistency and usability of the data.
  • Responsible for coordinating with the Business Analysts and users to understand business and functional requirements and implement the same into an ETL design document.
  • Performance tuning of Informatica mappings by identifying bottlenecks and implementing effective transformation logic.
  • Created and monitored sessions and various other tasks such as Decision, Email, Assignment and Command using Informatica Workflow Manager.
  • Worked on Batch Processing or Scheduling and Shell Scripting.
  • Developed PL/SQL stored procedures and functions to handle various activities outside of Informatica ETL process. Created PL/SQL Functions & Procedures for the ETL process and for Data Validations between source and target tables.
  • Closely worked and assisted the QA team during the test cycles to resolve the issues and bug fixing.
  • Designed and developed ETL specifications for source to target mappings
  • Wrote data-validity scripts for the QA process to compare the target data against the source system.
  • Worked with QA team on Full Load Testing, Attributes Testing, ETL Incremental Regression Testing, and Defect Tracking.
  • Wrote UNIX Shell Scripts to execute PowerCenter Sessions/Workflows.
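Scripting workflow execution, as in the last bullet, typically wraps the PowerCenter `pmcmd startworkflow` command. A minimal sketch is shown below in Python for illustration (on the project this was done in Unix shell scripts); all service, domain, folder, and workflow names are hypothetical, and the sketch only assembles the argument list rather than invoking the real binary.

```python
# Sketch of scripting a PowerCenter workflow start via pmcmd.
# Names are hypothetical; a real script would also supply credentials
# (e.g. pmcmd's password options or an environment variable), omitted here.

def pmcmd_start_args(service, domain, user, folder, workflow, wait=True):
    """Assemble the pmcmd startworkflow argument list (not executed here)."""
    args = [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # PowerCenter domain
        "-u", user,       # repository user
        "-f", folder,     # repository folder
    ]
    if wait:
        args.append("-wait")  # block until the workflow finishes
    args.append(workflow)
    return args

# Hypothetical usage: start wf_load_dim and wait for completion.
args = pmcmd_start_args("IS_DEV", "Domain_Dev", "etl_user", "HR_Folder", "wf_load_dim")
```

In a real scheduler job the list would be passed to `subprocess.run(args)` (or the equivalent shell invocation), with the exit code checked to decide success or failure of the load.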

Environment: Informatica 7.1, Cognos 8, SQL, PL/SQL, Oracle 10g, UNIX, Control M.

Confidential

Project Lead

Responsibilities:

  • Developed Extract-Transform-Load (ETL) routines using Informatica PowerCenter 7.1 running in Oracle, Windows /UNIX environment.
  • Designed and developed ETL specifications and source to target mappings.
  • Optimized the Performance tuning of the Informatica mappings and sessions.
  • Developed stored procedures and functions to handle various activities outside of Informatica ETL process.
  • Technically led the team and handled technical questions from the team.
  • Worked with QA team on Full Load Testing, Attributes Testing, ETL Incremental Regression Testing, and Defect Tracking.
  • Created data-validity scripts for the QA process to compare the target data against the source system.

Environment: Informatica 7.1, Sybase 12.0, DB2 8.1, SQL, Control M, Business Objects 6.1
