Informatica Data Engineer Resume

West Chester, PA

SUMMARY

  • Skilled IT professional with 5+ years of experience in requirement analysis, design, development, testing, implementation, documentation, and support of various data movement/data warehousing applications.
  • Proficient in leveraging industry-standard data integration tools for BI and data migration efforts.
  • Extensive experience in extracting, transforming, and loading data from various source databases/file systems to target data marts.
  • Adept at employing Informatica PowerCenter for ETL. Worked extensively with all Informatica Designer tools - Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer; used Informatica Workflow Manager and Workflow Monitor to build and run workflows.
  • Developed several mappings with complex transformation logic using transformations such as connected and unconnected Lookup, Stored Procedure, HTTP, Filter, and Expression to extract data from diverse sources including flat files, RDBMS tables, legacy system files, and REST APIs.
  • Experience with database systems such as Oracle 12c/11g, MS SQL Server 2014, DB2 10.x (DB2 LUW/DB2 z/OS), and Teradata 15/14.
  • Comprehensive understanding of data warehousing concepts: relational databases, entity-relationship diagrams, normalization, Kimball/Inmon methodologies, star/snowflake schemas, facts and dimensions, slowly changing dimensions, change data capture, incremental aggregation, etc.
  • Followed a unit-testing checklist to catch common development errors. Collaborated with QEs on data validation activities. Assisted in developing scheduled (daily/weekly) quality control checks for high-visibility tables in the mart.
  • Involved in multiple full-lifecycle developments of Data Warehouse and Data Mart applications serving a variety of decision support systems and enterprise Business Intelligence solutions.
  • Experience in performance tuning of Informatica data flows through mapping optimizations.
  • Worked closely with IT architects, business analysts, and end users to finalize requirements and sign off on mapping documents for data flows.
  • Familiarity with distributing work among team members and setting reasonable goals for task completion.
  • Well versed in the onsite-offshore delivery model.
  • Experience in UNIX/Linux environments, writing UNIX shell scripts for Informatica pre- and post-session operations, pmcmd scripts, database archival scripts, etc.; a minimal pmcmd sketch follows this list.
  • Packaged Linux scripts as RPMs (Red Hat Package Manager) for deployment to different environments.
  • Created change records, assigned tasks to responsible workgroups, and spearheaded several deployments into production.
  • Served as a key resource in a Data Warehouse migration effort that involved migrating the production workflows to a new Informatica repository and packaging the Linux scripts to a new server.
  • Part of an Agile team that used JIRA/Confluence with daily scrums, weekly backlog grooming, and bi-weekly planning/retrospectives. Created stories per the agreed-upon template.
  • Good interpersonal, presentation, and development skills with a strong analytical, problem-solving approach; an excellent team player and self-motivated individual with good communication skills.
  • Bachelor of Technology in Computer Science Engineering.
  • Authorized to work in US for any employer without sponsorship.
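A minimal sketch of the pmcmd wrapper pattern referenced in the UNIX scripting bullet above; the domain, service, folder, workflow, and account names are all placeholder assumptions, and the password is read from an environment variable rather than hard-coded.

#!/usr/bin/env bash
# Hypothetical pmcmd wrapper: start a workflow, wait for completion, and
# propagate the exit code to the scheduler. All names are placeholders.
set -u

DOMAIN="Domain_Dev"            # assumed Informatica domain
INT_SVC="IS_Dev"               # assumed integration service
FOLDER="EDW_LOADS"             # assumed repository folder
WORKFLOW="wf_load_customer"    # assumed workflow name
PARAM_FILE="/opt/etl/param/${WORKFLOW}.par"

# -pv names the environment variable that holds the password (INFA_PASS).
pmcmd startworkflow \
  -sv "$INT_SVC" -d "$DOMAIN" \
  -u etl_svc -pv INFA_PASS \
  -f "$FOLDER" -paramfile "$PARAM_FILE" \
  -wait "$WORKFLOW"
rc=$?

[ "$rc" -ne 0 ] && echo "ERROR: $WORKFLOW failed (pmcmd rc=$rc)" >&2
exit "$rc"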

TECHNICAL SKILLS

ETL: Informatica PowerCenter 10.x/9.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica Data Quality (IDQ), SAS IT Resource Management 3.2

Databases: Oracle 12c/11g, DB2 10.x, SQL Server 2014, Teradata 15/14, MySQL

Scheduling tools: Stonebranch, Skybot, Control-M

Operating Systems: Windows, UNIX, Linux

Querying Tools: IBM Optim, Data Studio, TOAD, SQL Assistant, SSMS, AQT

Packages: HP Service Manager, HP ALM, Red Hat Package Manager, MS Access, MS Visio, JIRA

Reporting Tools: Cognos 10/8.4, SAS EG 6.1, QlikView

Languages: SQL, PL/SQL, C, Shell scripting

PROFESSIONAL EXPERIENCE

Confidential, West Chester, PA

Informatica Data Engineer

Responsibilities:

  • Analyze the requirement at hand and develop Informatica mappings to extract data from source systems (DB2 z/OS, DB2 LUW, Oracle DW, host files/SAS data sets) to a landing zone on the Informatica server.
  • Use Teradata Parallel Transporter/FastLoad to import data from the landing zone into intermediary tables in the Teradata staging area; a FastLoad sketch follows this list.
  • Use the Pushdown Optimization feature (full PDO) exclusively to read data from the Teradata staging area, apply the necessary Informatica transformations according to the business need, and load the results into the EDW core tables.
  • Work with Teradata EXPLAIN plans, primary indexes (PI, NUPI), secondary indexes (USI, NUSI), partitioned primary indexes (PPI), join indexes (JI), COLLECT STATISTICS, and global/temporary/derived/volatile tables as applicable to the given requirement; a BTEQ sketch follows this list.
  • Prioritize work from the backlog on the JIRA board, estimate the complexity of each story and assign story points during sprint planning, create sub-tasks for each story as part of sprint grooming, and discuss story progress in daily Scrum stand-ups.
  • Create Enterprise Database Change Management (EDCM) requests for creating new tables or elevating tables to higher environments.
  • Create source to target mapping documents and TMMs reflecting the code in each Informatica mapping developed, get the document peer reviewed and approved by the test analyst.
  • Create SVN/Jenkins build scripts to package the workflows/parameter files and migrate to QA environment.
  • Assist in creating test cases, write test scripts and upload test results to HP Application Lifecycle Management tool.
  • Create Change records in HP Service Manager for promoting the Informatica workflows, scripts, and parameter files to production. Attach the session/workflow logs, HP ALM test case links, Jenkins build script deployment packages and Backfill SQL scripts to the respective tasks in the change record.
  • Establish connections to source and target production environments from the SAS server. Use SAS Enterprise Guide to write SAS programs that import and compare data between source and target, then generate a SAS report with the comparison results, including aggregates and percentages of matched and mismatched records. Explain possible causes when the mismatch percentage exceeds the specified threshold.
  • Work with product owners/business partners as part of User Acceptance Testing (UAT) and get sign-off on the SAS comparison report. Promote the tables/views to the EDW semantic layer after sign-off.
  • Engage in production support tasks, work on incident tickets as necessary.
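A hedged sketch of the FastLoad step referenced above: a shell wrapper feeds a FastLoad script through a heredoc to bulk-load a pipe-delimited landing file into an empty staging table. The TDPID, credentials, table, columns, and file path are placeholder assumptions.

#!/usr/bin/env bash
# Hypothetical FastLoad wrapper; TD_PASS is expected in the environment and
# is expanded by the shell before the script reaches fastload.
fastload <<EOF
LOGON tdprod/etl_svc,${TD_PASS};
DATABASE stg;
SET RECORD VARTEXT "|";
DEFINE
    in_cust_id   (VARCHAR(18)),
    in_cust_name (VARCHAR(100))
FILE = /data/landing/customer.dat;
BEGIN LOADING stg.customer_land
    ERRORFILES stg.customer_e1, stg.customer_e2;
INSERT INTO stg.customer_land (cust_id, cust_name)
VALUES (:in_cust_id, :in_cust_name);
END LOADING;
LOGOFF;
EOF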
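Similarly, a sketch of the BTEQ-style tuning work mentioned in the EXPLAIN/statistics bullet; the tables and join column are illustrative assumptions only.

#!/usr/bin/env bash
# Hypothetical BTEQ session: inspect the optimizer plan for a staging-to-core
# join and refresh the statistics that plan depends on.
bteq <<EOF
.LOGON tdprod/etl_svc,${TD_PASS}

EXPLAIN
SELECT s.cust_id, d.cust_sk
FROM   stg.customer_land s
JOIN   edw.dim_customer  d ON s.cust_id = d.cust_id;

COLLECT STATISTICS COLUMN (cust_id) ON stg.customer_land;

.LOGOFF
.QUIT
EOF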

Environment: Informatica PowerCenter 10.1, Informatica Metadata Manager 10.1, SAS Enterprise Guide 6.1, Linux shell, Teradata Client 15.1, Teradata MultiLoad, FastLoad, TPump, IBM Data Studio, Advanced Query Tool v9, TOAD 10.x, SQL Assistant 7.x, Stonebranch, MicroStrategy, Tableau, FileZilla FTP client, WinSCP, TortoiseSVN, GitHub/Jenkins, Netezza 7.2, Oracle 11g, MS SQL Server 2012, DB2 LUW, DB2 z/OS, HP ALM

Confidential, Natick, MA

Data Analyst

Responsibilities:

  • Analyze the requirements provided by various business users and create data mapping documents.
  • Develop system components using the Informatica Designer 10.x tool to load data from multiple source files to the intermediate staging area and then to the reporting tables in the data warehouse.
  • Use Teradata utilities (FastLoad/TPump) where necessary; use full pushdown optimization when loading from staging tables to reporting tables.
  • Make extensive use of stored procedures and SQL functions in queries.
  • Configure deployment groups using Informatica Repository Tool to integrate all the ETL objects like Source definitions, Target definitions, Transformations, Mappings, Sessions, Workflows, etc. to be migrated to Production environment.
  • Provide 24/7 on-call support to ensure all jobs execute correctly in production.
  • Create flexible mappings/sessions using mapping parameters and variables, with heavy use of parameter files; a parameter-file sketch follows this list.
  • Analyze the session logs, bad files and error tables for troubleshooting mappings and sessions.
  • Tune the performance of mappings that insert millions of rows into targets by identifying bottlenecks at the target, source, mapping, and session level.
  • Develop and document Data Mappings/Transformations, and Informatica sessions as per the business requirement.
  • Work with QA in developing a comprehensive quality control check that runs after the ETL job and sends out a report if any errors are found.
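A minimal sketch of the parameter-file pattern called out above: a shell step writes a fresh parameter file per run so one mapping serves any load date. The folder, workflow, session, and parameter names are assumptions.

#!/usr/bin/env bash
# Hypothetical generator for an Informatica parameter file; the
# [folder.WF:workflow.ST:session] header and $$/$ names are placeholders.
RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE="/opt/etl/param/wf_load_sales.par"

cat > "$PARAM_FILE" <<EOF
[DW_SALES.WF:wf_load_sales.ST:s_m_load_sales]
\$\$RUN_DATE=$RUN_DATE
\$InputFile_Sales=/data/in/sales_${RUN_DATE}.csv
\$DBConnection_Tgt=TD_EDW
EOF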

Environment: Informatica PowerCenter 10.x, Oracle 11g, MS SQL Server 2014, SQL Assistant, UNIX shell scripting, Windows 10, JIRA, Teradata utilities

Confidential, Bloomington, IL

ETL Developer

Responsibilities:

  • Developed mappings in Informatica PowerCenter to load the data from various sources using transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Filter, Router etc. into a central repository on DB2
  • Created parameter files in Informatica PowerCenter and passed them to Informatica PowerCenter Tasks.
  • Tuned Informatica PowerCenter mappings for better performance.
  • Extensively used SQL Developer, IBM Optim/Data Studio, and SSMS to query tables and access data.
  • Wrote UNIX scripts for data cleansing and pre/post-processing wherever preferable; a cleansing sketch follows this list.
  • Responsible for distributing work to the offshore team, coordinating work progress, and verifying offshore code and the corresponding test cases.
  • Created connection strings for various source and target databases.
  • Worked closely with the end users and Business Analysts to understand the business and develop the transformation logic to be used in Informatica PowerCenter.
  • Sourced SQL Server, Oracle, DB2 z/OS, and XML data and transformed it per business requirements for loading into an Operational Reports Repository and Operational Tooling Data Store residing on DB2.
  • Designed and documented validation rules, error handling routines and testing strategies for the mappings.
  • Involved in service work, including but not limited to monitoring production workflows with jobs running 24/7.
  • Developed load scripts, RUNSTATS jobs, purge scripts, etc. for data inserts/manipulation in the DB2 system; a RUNSTATS sketch follows this list.
  • Provided Ad hoc Excel reports for less intensive business needs.
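A small sketch of the cleansing scripts mentioned above; the file paths, delimiter, and expected field count are assumptions.

#!/usr/bin/env bash
# Hypothetical pre-processing step run before the Informatica session:
# strip DOS carriage returns, drop the header row, and divert short records.
set -euo pipefail

IN="/data/in/vendor_extract.csv"       # assumed inbound extract
OUT="/data/stage/vendor_extract.dat"   # assumed session source file
BAD="/data/reject/vendor_extract.bad"  # rows with the wrong field count

tr -d '\r' < "$IN" | sed '1d' |
  awk -F'|' -v bad="$BAD" '{ if (NF == 12) print; else print > bad }' > "$OUT"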
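And a sketch of the RUNSTATS-style maintenance referenced above, using the DB2 command line processor; the database, schema, and table names are placeholders.

#!/usr/bin/env bash
# Hypothetical DB2 maintenance step: refresh table statistics after a large
# insert so the optimizer works from current cardinalities.
set -e
db2 connect to EDWDB
db2 "RUNSTATS ON TABLE OPSRPT.DAILY_TXN WITH DISTRIBUTION AND DETAILED INDEXES ALL"
db2 connect reset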

Environment: Informatica PowerCenter 9.6, DB2, SAS ITRM, Oracle 11g, SQL Server 2012/2008, IBM Optim, Data Studio, SQL Developer, SSMS, Cognos 10/8.4, SAS EG 4.3, Shell Scripting, Control-M, Rumba-Micro Focus

Confidential

Software Developer

Responsibilities:

  • Create and maintain reporting data sets for internal metrics.
  • Generate on-demand Excel reports from SQL Server result sets.
  • Develop automated processes to load data from CSV files into database tables.
  • Create linked servers to connect to other databases, and load from the linked-server databases into temp tables in SQL Server.
  • Create SQL Server Agent jobs to fetch data from Finacle files and bulk-insert it into database tables; a sketch follows this list.
  • Create customized views per user requirements and to limit access to sensitive data.
  • Write T-SQL stored procedures for data transformation/manipulation and schedule them using SQL Server Agent.
  • Write Windows batch scripts to move/copy files and manipulate data in files as needed.
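A hedged sketch of the Finacle load described above. In production this ran as a T-SQL step inside a SQL Server Agent job; the sqlcmd wrapper below simply makes the snippet runnable from a client, and the server, database, table, and file path are placeholder assumptions.

#!/usr/bin/env bash
# Hypothetical BULK INSERT of a pipe-delimited Finacle extract into a
# staging table; sqlcmd reads the T-SQL from stdin.
sqlcmd -S SQLPROD01 -d FinacleStage <<'EOF'
BULK INSERT dbo.txn_stage
FROM 'D:\extracts\finacle_txn.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2      -- skip the header row
);
GO
EOF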

Environment: MS SQL Server 2008, SSMS, T-SQL, SharePoint, MS Office Suite, Windows XP, Oracle 10g, WinSCP
