Sr. Informatica Developer / Data Analyst Resume

Saint Paul, MN

SUMMARY:

  • Data Analyst / ETL Informatica expert with 6+ years of total IT experience, working as a data analyst / ETL expert on a wide variety of projects and skilled in the analysis, design, and implementation of package-enabled business transformation initiatives.
  • 6+ years of focused experience in Information Technology with a strong background in Database development and strong ETL skills for Data warehousing using Informatica.
  • Extensive experience in developing ETL applications and performing statistical analysis of data on Oracle, DB2, Teradata, Netezza, MySQL, PostgreSQL, and SQL Server databases.
  • Superior SQL skills with the ability to write and interpret complex SQL statements; also skilled in SQL optimization, ETL debugging, and performance tuning.
  • Experience in developing online transaction processing (OLTP), operational data store (ODS), and decision support system (DSS) solutions.
  • Experience in Inmon and Kimball data warehouse design and implementation methodologies
  • Strong familiarity with master data and metadata management and associated processes
  • Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, data profiling tools, and data and information system life cycle methodologies.
  • Experience with dimensional modeling and architecture, implementing appropriate data structures for analytical reporting from an enterprise data warehouse.
  • Experience with Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, SQL Assistant.
  • Expertise in SQL and PL/SQL programming, including views, analytical functions, stored procedures, functions, and triggers.
  • Implemented Change Data Capture (CDC) with Informatica Power Exchange.
  • Used Informatica PowerExchange to access VSAM files; also worked with flat files, JSON, and XML files.
  • Well versed in data quality features such as Analyst and IDD, and transformations such as Key Generator, Standardizer, Case Converter, Match, and Consolidation.
  • Applied Address transformation for Address Validation and Standardization.
  • Strong in implementing data profiling and documenting data quality metrics such as accuracy, completeness, duplication, validity, and consistency.
  • Skilled in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.
  • Very strong knowledge of Informatica Data Quality transformations like Address validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations.
  • Extensively worked on Informatica PowerCenter transformations such as Expression, Joiner, Sorter, Filter, Router, and others as required.
  • Very strong knowledge of the end-to-end process of gathering and implementing data quality requirements.
  • Good experience with data warehouse concepts such as dimension tables, fact tables, slowly changing dimensions, data marts, and dimensional modeling schemas.
  • Experience in data modeling, including dimensional modeling, E-R modeling, and OLTP/OLAP data analysis; very familiar with SCD Type 1 and Type 2 in snowflake and star schemas.
  • Experience in extraction, transformation, and loading (ETL) of data from various data sources into data marts and the data warehouse using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and Informatica Administration Console).
  • Strong Experience in developing Sessions/Tasks, Worklets and Workflows using Workflow Manager tools -Task Developer, Workflow & Worklet Designer.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance for the large volume projects.
  • Experience in debugging mappings and identifying bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good experience in writing UNIX shell scripts, SQL scripts for development, automation of ETL process, error handling and auditing purposes.
  • Experience with AWS (Amazon Web Services), including S3 buckets and Redshift (the AWS cloud data warehouse).
  • Expertise in Agile software development processes.
  • Extensive experience of providing IT services in Retail, Financial and Health care industries.
  • Strong Knowledge of Hadoop Ecosystem (HDFS, HBase, Scala, Hive, Pig, Flume, NoSQL etc.) and Data modelling in Hadoop environment.

TECHNICAL SKILLS

ETL: Informatica PowerCenter 10.0.1, 9.6.1, 9.0, 8.1.1; SAP Data Services 4.2

Data Profiling Tools: Informatica IDQ 10.0, 9.6.1, 8.6.1

ETL Scheduling Tools: Control-M, ESP

RDBMS: DB2, Oracle 11g/12c, SQL Server 2008/2012, MySQL, PostgreSQL

Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)

Data Modeling Tools: Erwin 9.3/7.5

UNIX: Shell scripting

Reporting Tools: SAP BOBJ 4.2, Tableau 9, Cognos 8x/9x

Defect Tracking Tools: Quality Center

Operating Systems: Windows XP/2000/9x/NT, UNIX

Source Management: BitBucket, Visual SourceSafe

Cloud Computing: Amazon Web Services (AWS), S3 bucket, Redshift

Programming Languages: C, C++, PL/SQL

Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, JIRA, Rally

PROFESSIONAL EXPERIENCE

Confidential, Saint Paul, MN

Sr. Informatica Developer / Data Analyst

Responsibilities:

  • Extensively Worked with Business Users to gather, verify and validate various business requirements.
  • Identified various source systems, connectivity, tables to ensure data availability to start the ETL process.
  • Worked with the data modeler to understand the business logic and modified the ODS and Data Mart data models.
  • Worked with business analysts and end users to correlate business logic and specifications for ETL / Informatica Cloud development; also worked with the source team, support team, and SMEs to analyze the source systems' data.
  • Created Informatica Cloud tasks to extract, transform, and load data from SQL Server to cloud-based targets such as an S3 bucket and a Redshift database.
  • Worked extensively on Source, Target, Lookup, Expression, Filter transformations in Informatica Cloud.
  • Created Design Documents for source to target mappings. Developed mappings to load data files daily to AWS s3 bucket in JSON format.
  • Used UNIX scripting to apply rules to the raw data within the AWS S3 bucket and accessed the Redshift database through client tools.
  • Created complex mappings using Unconnected and Connected Lookup, Aggregator, and Router transformations to populate target tables efficiently.
  • Created stored procedures to use Oracle-generated sequence numbers in mappings instead of the Informatica Sequence Generator.
  • Created complex Mappings and implemented Slowly Changing Dimensions Type 1 and Type 2 for data loads.
  • Created complex Mappings to implement data cleansing on the source data.
  • Used Mapping Variables, Mapping Parameters and Session Parameters to increase the re-usability of the Mapping.
  • Created source to target mappings, edit rules and validation, transformations, and business rules. Analyzed client requirements and designed the ETL Informatica mapping.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Validated and tested the mappings using Informatica Debugger, Session Logs and Workflow Logs.
  • Created detail Unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering out the missing rows into flat files at the mapping level.
  • Used the UltraEdit tool and UNIX commands to create, access, and maintain the session parameter files, data files, and scripts on the server.
  • Followed and automated the Acceptance Test Driven Development (ATDD) and Test Driven Development (TDD) for unit tests for Informatica ETL.
  • Scheduled the ETL jobs using the Control-M scheduler and customized the ETL wrapper scripts (a minimal wrapper sketch follows this list).
  • Trained and mentored new team members on domain knowledge, technical skills, and all processes.
  • Supported the application during the warranty period.
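
Below is a minimal sketch of the kind of Control-M ETL wrapper script referenced above, assuming pmcmd is on the path; the service, domain, folder, workflow, and parameter-file names are hypothetical placeholders, not actual project values.

#!/bin/ksh
# Hypothetical Control-M wrapper: start an Informatica workflow via pmcmd
# and return a non-zero exit code on failure so the scheduler can alert.

INFA_SVC="INT_SVC_DEV"                      # Integration Service (placeholder)
INFA_DOMAIN="DOM_DEV"                       # Informatica domain (placeholder)
INFA_FOLDER="EDW_LOADS"                     # repository folder (placeholder)
WF_NAME="$1"                                # workflow name passed in by Control-M
PARAM_FILE="/infa/params/${WF_NAME}.par"    # session parameter file (placeholder path)

# -uv/-pv read the user and password from environment variables,
# keeping credentials out of the wrapper script itself
pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
      -uv INFA_USER -pv INFA_PWD \
      -f "$INFA_FOLDER" -paramfile "$PARAM_FILE" \
      -wait "$WF_NAME"
rc=$?

if [ $rc -ne 0 ]; then
    echo "ERROR: workflow $WF_NAME failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow $WF_NAME completed successfully"
exit 0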

Environment: Informatica PowerCenter 10.0, Informatica PowerExchange 10.0, Informatica Data Quality 10.0, Amazon Redshift, Cognos 10.0, Tableau 10, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Autosys, Shell Scripting, Rally, JIRA, PostgreSQL 9.2, Control-M, GitHub, AWS S3 bucket, HDFS, Hive

Confidential

Sr. Informatica Developer / Data Analyst

Responsibilities:

  • Coordinated with various business users, stakeholders, and SMEs for functional expertise, review of design and business test scenarios, UAT participation, and validation of data from multiple sources.
  • Defined and developed brand new standard design patterns, ETL frameworks, data model standards and guidelines, and ETL best practices.
  • Provided technical design leadership to this project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.
  • Performed detailed data investigation and analysis of known data quality issues in related databases through SQL
  • Actively involved in Analysis phase of the business requirement and design of the Informatica mappings.
  • Performed data validation, data profiling, data auditing and data cleansing activities to ensure high quality Cognos report deliveries.
  • Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.
  • Developed Informatica mappings for TYPE 2 Slowly Changing Dimensions.
  • Used Teradata utilities such as FastLoad, MultiLoad, and FastExport; wrote many BTEQ scripts for loading data into staging and target tables.
  • Created BTEQ scripts for data quality validations such as referential integrity, duplicate, and NULL checks (a minimal sketch follows this list).
  • Created sessions and work-flows for the Informatica mappings.
  • Heavily used Informatica Cloud integration with the Amazon Redshift connector and integrated data from various sources.
  • Configured sessions for different situations including incremental aggregation, pipe-line partitioning etc.
  • Created mappings with different look-ups like connected look-up, unconnected look-up, Dynamic look-up with different caches such as persistent cache etc.
  • Created various Mapplets as part of mapping design.
  • Involved in writing/understanding existing Oracle stored procedures and functions for calling during the execution of Informatica mapping or as Pre or Post session execution.
  • Created effective Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.
  • Documented Mappings, Transformations and Informatica sessions.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.
  • Extensively involved in testing by writing some QA procedures, for testing the target data against source data.
  • Wrote UNIX shell scripts for file manipulation, FTP, and workflow scheduling.
  • Coordinated with the offshore team on a daily basis to speed up development.
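
Below is a minimal sketch of the kind of BTEQ data quality validation described above; the logon string, database, table, and column names are hypothetical placeholders, and the checks are reduced to NULL and duplicate business keys.

#!/bin/ksh
# Hypothetical BTEQ validation: fail the job if NULL or duplicate
# business keys are found in a staging table. All names are placeholders.

bteq <<EOF
.LOGON tdprod/etl_user,etl_password;

/* NULL check on the business key: any returned row is a violation */
SELECT cust_id FROM stg_db.stg_customer WHERE cust_id IS NULL;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

/* Duplicate check on the business key */
SELECT cust_id FROM stg_db.stg_customer GROUP BY cust_id HAVING COUNT(*) > 1;
.IF ACTIVITYCOUNT > 0 THEN .QUIT 9;

.LOGOFF;
.QUIT 0;
EOF
exit $?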

Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.6.1, Informatica Data Quality 9.6.1, Amazon Redshift, Cognos 9.0, Sun Solaris, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Shell Scripting, Rally, JIRA, Teradata 14, Control-M, Autosys, GitHub, Hadoop, Hive, Cognos 10

Confidential

Informatica Developer / Data Analyst

Responsibilities:

  • Actively involved in interacting with business users to record user requirements and Business Analysis.
  • Translated requirements into business rules & made recommendations for innovative IT solutions.
  • Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this data migration project.
  • Parsed high-level design specifications into simple ETL coding and mapping standards.
  • Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Created the design and technical specifications for the ETL process of the project.
  • Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
  • Responsible for mapping and transforming existing feeds into the new data structures and standards, utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
  • Worked with Informatica PowerCenter tools - Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplets, and Transformations.
  • Worked with slowly changing dimension Type1, Type2, and Type3.
  • Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and Maintenance activities of the data warehouse.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Utilized Informatica IDQ to complete the initial data profiling and matching/removing duplicate data for the process of data migration from the legacy systems to the target Oracle Database.
  • Designed and developed Informatica DQ jobs and mapplets using transformations such as Address Validator, Match, Consolidation, and rules for data loads and data cleansing.
  • Extensively used Informatica Data Quality tool (IDQ Developer) to create rule based data validations for profiling.
  • Used Teradata utilities such as FastLoad, MultiLoad, and FastExport; also wrote many BTEQ scripts for loading data into target tables.
  • Created dictionary tables using IDQ analyst tool for data validations.
  • Implemented various new components like increasing the DTM Buffer Size, Database Estimation, Incremental Loading, Incremental aggregation, Validation Techniques, and load efficiency.
  • Strong in exception handling mappings for data quality, data cleansing, and data validation.
  • Worked with SQL*Loader to load data from flat files obtained from various facilities (a minimal control-file sketch follows this list).
  • Created Workflows containing command, email, session, decision and a wide variety of tasks.
  • Developed Parameter files for passing values to the mappings for each type of client
  • Scheduled batches and sessions within Informatica using the Informatica scheduler and also wrote shell scripts for job scheduling.
  • Understood the entire functionality and major algorithms of the project while adhering to the company testing process.
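
Below is a minimal sketch of a SQL*Loader load like the flat-file loads referenced above; the control-file layout, table, columns, and file names are hypothetical placeholders.

#!/bin/ksh
# Hypothetical SQL*Loader run: build a control file and load a comma-delimited
# facility extract into a staging table. All names and columns are placeholders.
# DB_USER, DB_PASS, and DB_SID are assumed to be exported by the calling environment.

cat > member_stage.ctl <<'EOF'
LOAD DATA
INFILE 'member_extract.dat'
APPEND
INTO TABLE stg_member
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  member_id,
  facility_cd,
  enroll_dt   DATE "YYYY-MM-DD",
  plan_amt
)
EOF

sqlldr userid=${DB_USER}/${DB_PASS}@${DB_SID} \
       control=member_stage.ctl \
       log=member_stage.log \
       bad=member_stage.bad \
       errors=50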

Environment: Informatica PowerCenter 8.6, Informatica IDQ 8.6, SQL Server 2008, Oracle 10g, Teradata V2R5, Shell Scripts, Erwin 7.5, TOAD, UNIX, Cognos 9, SQL, PL/SQL, SQL Developer, HP Quality Center

Confidential

Informatica Developer / Data Analyst

Responsibilities:

  • Responsible for design and development of Salesforce Data Warehouse migration project leveraging Informatica PowerCenter ETL tool.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Designed and developed complex ETL mappings making use of transformations like Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
  • Created summarized tables, control tables, staging tables to improve the system performance and as a source for immediate recovery of Teradata database
  • Extracted the Salesforce CRM information into BI Data Warehouse using Force.com API/Informatica on Demand to provide integration with oracle financial information to perform advanced reporting and analysis.
  • Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
  • Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
  • Used pmcmd command to run workflows from command line interface.
  • Responsible for the data management and data cleansing activities using Informatica data quality (IDQ).
  • Involved in writing Oracle PL/SQL procedure and functions for calling during the execution of Informatica mapping or as Pre or Post session execution.
  • Performed data quality analysis to validate the input data based on the cleansing rules.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
  • Used the sandbox for testing to ensure minimum code coverage for the application to be migrated to production.
  • Used the pmcmd command to start, stop, and ping the Integration Service from UNIX, and created UNIX shell scripts to automate the process (a minimal sketch follows this list).
  • Improved performance through testing and tuning at the mapping and session levels.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Created XML definitions and Autosys jobs for the developed workflows.
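
Below is a minimal sketch of the pmcmd start/stop/ping automation described above; the service, domain, folder, and workflow names are hypothetical placeholders.

#!/bin/ksh
# Hypothetical pmcmd automation: ping the Integration Service, then start a
# workflow and wait for it to finish. All names below are placeholders.

SVC="INT_SVC_PROD"
DOM="DOM_PROD"
FOLDER="SFDC_DW"
WF="wf_load_accounts"

# Verify the Integration Service is reachable before submitting work
pmcmd pingservice -sv "$SVC" -d "$DOM" || exit 1

# Start the workflow and wait for completion; -uv/-pv read credentials
# from environment variables instead of hard-coding them
pmcmd startworkflow -sv "$SVC" -d "$DOM" -uv INFA_USER -pv INFA_PWD \
      -f "$FOLDER" -wait "$WF"

# To cancel a running workflow:
# pmcmd stopworkflow -sv "$SVC" -d "$DOM" -uv INFA_USER -pv INFA_PWD -f "$FOLDER" "$WF"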

Environment: Informatica PowerCenter 8.1, Teradata V2R5, Business Objects 6.0, Oracle 10g, SQL*Loader, PL/SQL, SQL Server 2004, UNIX Shell Programming, Autosys, Linux and Windows NT
