Sr. Informatica Engineer / Data Analyst Resume
Minneapolis, MN
SUMMARY
- 10+ years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation in data warehouse applications, ETL processing, and distributed applications.
- Excellent domain knowledge of Health Care, Banking, Finance, and Insurance.
- Expertise in using the ETL tool Informatica PowerCenter 8.x/9.x/10.2 (Mapping Designer, Workflow Manager, Repository Manager), Informatica Data Quality (IDQ), and ETL concepts.
- Extensive knowledge of RDBMS, Business Intelligence, and Data Warehousing concepts, with emphasis on ETL and the System Development Life Cycle (SDLC).
- Hands-on experience working in the Waterfall model as well as the Agile model, including implementation of various sprints.
- Interacted with end users and functional analysts to identify and develop Business Requirement Documents (BRD) and transform them into technical requirements.
- Strong experience with Informatica mapping tools, i.e., Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, and Informatica Repository.
- Designed and developed complex mappings to move data from multiple sources into common target areas such as Data Marts and the Data Warehouse, using Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Normalizer, Sequence Generator, Update Strategy, and Stored Procedure transformations with varied transformation logic in Informatica.
- Worked with various Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter (TPT); highly experienced in Teradata SQL programming.
- Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
- Strong hands-on experience in extracting data from various source systems, ranging from mainframe sources like DB2, flat files, and VSAM files to RDBMS like Oracle, SQL Server, and Teradata.
- Extensively used Slowly Changing Dimension (SCD Type 1, 2, 3) techniques in ETL transformations.
- Expertise in OLTP/OLAP system study, analysis, and E-R diagramming, developing dimensional models using Star schema and Snowflake schema techniques in relational, dimensional, and multidimensional modeling.
- Optimized mappings by creating reusable transformations and Mapplets; performed debugging and performance tuning of sources, targets, mappings, transformations, and sessions.
- Experienced in SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views, and materialized views.
- Experience in task automation using UNIX scripts, job scheduling, and communicating with the server using the pmcmd command (see the sketch after this list). Extensively used Autosys for job monitoring and scheduling. Automated the ETL process using UNIX shell scripting.
- Proficient in converting logical data models to physical database designs in Data Warehouse Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts and Data Analysis.
- Experience in defining standards, methodologies and performing technical design reviews.
- Excellent communication skills, interpersonal skills, self-motivated, quick learner and outstanding team player.
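A minimal sketch of the kind of pmcmd wrapper script described above. The service, domain, folder, and workflow names (INFA_IS, INFA_DOM, DW_FOLDER, wf_DAILY_LOAD) are hypothetical placeholders.

#!/bin/ksh
# Start an Informatica workflow via pmcmd and wait for completion;
# a non-zero exit code lets the scheduler (e.g., Autosys) flag the job.
pmcmd startworkflow -sv INFA_IS -d INFA_DOM \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f DW_FOLDER -wait wf_DAILY_LOAD
rc=$?
if [ $rc -ne 0 ]; then
    echo "wf_DAILY_LOAD failed with return code $rc" >&2
    exit $rc
fi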
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter (10.2/9.x/8.x), PowerExchange 9.x, Informatica Data Quality (IDQ) 9.x, PowerConnect for IBM MQ Series, PowerConnect for Mainframes
BI Tools: Cognos 8/9, Tableau 10
Data Modeling: ERwin 9.5.2/7.3/4.1, MS - Visio
Databases: Teradata 15/14, Oracle 12c/11g/10g/9i, SQL Server 2008, DB2, MySQL, PostgreSQL
Languages: Java, C, C++, SQL, PL/SQL, UNIX Shell Scripting, HTML, XML
Big Data: Hadoop Ecosystem (HDFS, Hive, Pig)
OS: MS-DOS, HP-UX, Windows, and SunOS.
Methodologies: Ralph Kimball’s Star Schema and Snowflake Schema.
Others: MS Word, MS Access, T-SQL, TOAD, SQL Developer, Microsoft Office, Teradata Viewpoint, Teradata SQL Assistant, IceScrum, Rally, JIRA, Control-M, Autosys, GitHub
PROFESSIONAL EXPERIENCE
Confidential, Minneapolis, MN
Sr. Informatica Engineer / Data Analyst
Responsibilities:
- Actively involved in interacting with business users to record user requirements and perform business analysis.
- Translated requirements into business rules and made recommendations for innovative IT solutions.
- Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for this data migration project.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load data from flat files and the Oracle database.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica as an ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards, utilizing Router, Connected and Unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
- Worked on Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer and Mapplets, and Transformations.
- Prepared unit/system test plans and test cases for the developed mappings.
- Worked with Slowly Changing Dimensions Type 1, Type 2, and Type 3.
- Maintained Development, Test, and Production mappings and migration using Repository Manager. Involved in enhancements and maintenance activities of the data warehouse.
- Performed performance tuning of the process at the mapping, session, source, and target levels.
- Utilized Informatica IDQ to complete initial data profiling and matching/removal of duplicate data in the migration from legacy systems to the target Oracle database.
- Designed and developed Informatica DQ jobs and Mapplets using transformations like Address Validator, Matching, Consolidation, and Rules for data loads and data cleansing.
- Extensively used Informatica Data Quality tool (IDQ Developer) to create rule-based data validations for profiling.
- Created dictionary tables using IDQ analyst tool for data validations.
- Heavily used BTEQ scripts for loading data into target tables (an illustrative sketch follows this list).
- Loaded data from several flat files into staging using Teradata MLOAD, FLOAD, and BTEQ.
- Implemented various load-efficiency techniques such as increasing the DTM buffer size, database estimation, incremental loading, incremental aggregation, and validation techniques.
- Built exception-handling mappings for data quality, data cleansing, and data validation.
- Created workflows containing Command, Email, Session, Decision, and a wide variety of other tasks.
- Developed parameter files for passing values to the mappings for each type of client.
- Scheduled batches and sessions within Informatica using the Informatica scheduler and wrote shell scripts for job scheduling.
- Customized shell scripts to run mappings in Control-M.
- Understood the entire functionality and major algorithms of the project and adhered to the company testing process.
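An illustrative BTEQ load script of the kind referenced above. This is a sketch only; the logon string, file path, and database/table names (tdprod, stg_db.customer_stg) are hypothetical.

#!/bin/ksh
# Load a pipe-delimited flat file into a Teradata staging table via BTEQ.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = /data/in/customers.txt;
.QUIET ON
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(100))
INSERT INTO stg_db.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF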
Environment: Informatica PowerCenter 10.2, IDQ 10.2, Oracle 11g, DB2, Teradata 15, MS SQL Server 2012, Erwin 9.2, Shell Scripting, ClearCase, PuTTY, WinSCP, Notepad++, JIRA, Control-M, Cognos 10.
Confidential, Yonkers, NY
Sr. Informatica Developer
Responsibilities:
- Requirement gathering, business analysis, and documentation of functional, technical, and integration documents, including low-level and high-level design documents.
- Involved in Requirement analysis in support of Data Warehousing efforts.
- Wrote SQL queries against Snowflake.
- Worked on a cloud framework to land files in S3, then extract and load them into databases using IICS.
- Extensively worked with various active transformations like Filter, Sorter, Aggregator, Router, SQL, Union, Web Services, Lookup, and Joiner transformations in IICS.
- Worked on Migration from PowerCenter 9.6.1 to PowerCenter 10.2 version.
- Worked on moving data to the data lake using Informatica BDE and big data concepts.
- Worked with source databases like Oracle, SQL Server, and Teradata, and flat files.
- Extensively worked with Teradata utilities BTEQ, FastLoad, MultiLoad, and TPT to load data into the warehouse.
- Worked independently on the data migration tasks: backup and restore of the database, data comparison between databases, schema comparison, executing the migration scripts, etc.
- Built human task workflow scenarios and IDQ scorecards to support data remediation.
- Read and understood the API documentation.
- Involved in Unit testing, Smoke test, Functional testing, user acceptance testing and system integration testing.
- Created test plan, test data and executed test plan for business approvals.
- Created complex mappings using Unconnected and Connected Lookup Transformations using different caches.
- Involved in migration projects to migrate data from data warehouses on Oracle/DB2 to Teradata.
- Experience working with member, pharmacy, patient, provider, encounter, and claim data.
- Experience handling industry-standard EDI files like 835, 837, HL7, and NCPDP files.
- Experience working with parsing of structured and unstructured files using Informatica PowerCenter.
- Implemented Slowly Changing Dimension Type 1 and Type 2 for change data capture (an illustrative SCD Type 2 sketch follows this list).
- Used Informatica Pushdown Optimization (PDO) to push the transformation processing from the PowerCenter engine into the relational database to improve performance.
- Experienced in Teradata Parallel Transporter (TPT). Used full PDO on Teradata and worked with different Teradata load operators.
- Worked with various lookup caches like Dynamic Cache, Static Cache, Persistent Cache, Recache from Database, and Shared Cache.
- Worked on loading data into Hive and Impala systems for the data lake implementation.
- Designed solution to process file handling/movement, file archival, file validation, file processing and file error notification steps using Informatica.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems.
- Worked extensively with the Update Strategy transformation for implementing inserts and updates.
- Per business requirements, implemented auditing and balancing on the transactional sources so that every record read is either captured in the maintenance tables or written to the target tables.
- Enabled client tools to store reports and data models in the Pentaho Repository.
- Implemented data quality processes including transliteration, parsing, analysis, standardization, and enrichment at point of entry and in batch modes; deployed mappings that run in scheduled, batch, or real-time environments.
- Implemented Informatica Pushdown Optimization to utilize the database resources for better performance.
- Extensively used tasks like the Email task to deliver generated reports to mailboxes and Command tasks to run post-session and pre-session commands.
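A sketch of the change-data-capture pattern behind the SCD Type 2 bullet above, as two SQL steps run through BTEQ. The tables and the tracked attribute (dim_member, stg_member, address) are hypothetical placeholders.

#!/bin/ksh
# SCD Type 2 in two steps: expire the current row when a tracked
# attribute changes, then insert the new version with an open end date.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Step 1: close out current rows whose attributes changed */
UPDATE dim_member
SET eff_end_dt = CURRENT_DATE, current_flag = 'N'
WHERE current_flag = 'Y'
  AND member_id IN (
      SELECT s.member_id
      FROM stg_member s
      JOIN dim_member d
        ON d.member_id = s.member_id
       AND d.current_flag = 'Y'
      WHERE s.address <> d.address);

/* Step 2: insert new versions and brand-new members; after Step 1,
   changed members no longer have a current row, so the outer join
   catches both cases */
INSERT INTO dim_member
    (member_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.member_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_member s
LEFT JOIN dim_member d
  ON d.member_id = s.member_id
 AND d.current_flag = 'Y'
WHERE d.member_id IS NULL;

.LOGOFF;
.QUIT;
EOF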
Environment: Informatica PowerCenter 10.2/10.1, Informatica BDE 9.x, Oracle 12c, DB2, SQL Server 2012, Hive, IICS, Teradata 15, Hadoop, AWS, Linux, Toad, IDQ, Pentaho, Snowflake, UNIX.
Confidential, Dallas, TX
Sr. Informatica Engineer / Data Analyst
Responsibilities:
- Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
- Worked extensively with mappings using Expression, Aggregator, Filter, Lookup, Joiner, Update Strategy, and Stored Procedure transformations.
- Extensively used pre-SQL and post-SQL scripts for loading the data into the targets according to the requirements.
- Developed mappings to load Fact and Dimension tables for Type 1 and Type 2 dimensions and incremental loading, and unit tested the mappings.
- Extracted data from a web service source, transformed data using a web service, and loaded data into a web service target.
- Coordinated and developed all documents related to ETL design and development.
- Involved in designing the Data Mart models with Erwin using the Star schema methodology.
- Used Repository Manager to create repositories, users, and groups, and managed users by setting up privileges and profiles.
- Used the Debugger to debug mappings and correct them.
- Performed Database tasks such as creating database objects (tables, views, procedures, functions).
- Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
- Optimized the mappings and implemented the complex business rules by creating reusable transformations and Mapplets.
- Involved in writing BTEQ, MLOAD, and TPUMP scripts to load the data into Teradata tables (an illustrative MultiLoad sketch follows this list).
- Optimized the source queries to control temp space and added delay intervals, depending on the business requirement, for performance.
- Used Informatica Workflow Manager for creating and running batches and sessions and scheduling them to run at a specified time.
- Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
- Prepared functional specifications, system architecture/design, implementation strategy, test plans, and test cases.
- Implemented and documented all the best practices used for the data warehouse.
- Improved the performance of the ETL by indexing and caching.
- Created Workflows, tasks, database connections, FTP connections using workflow manager.
- Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
- Conducted code walkthroughs with team members.
- Developed stored procedures using PL/SQL and driving scripts using UNIX shell scripts.
- Created UNIX shell scripts for automation of ETL processes.
- Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.
- Automated ETL workflows using Control-M Scheduler.
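An illustrative MultiLoad script of the kind referenced above; a sketch only, with hypothetical logon, file path, and table names (tdprod, stg_db.claims_stg).

#!/bin/ksh
# MultiLoad a pipe-delimited flat file into a Teradata staging table.
mload <<'EOF'
.LOGTABLE stg_db.claims_log;
.LOGON tdprod/etl_user,etl_password;
.BEGIN IMPORT MLOAD TABLES stg_db.claims_stg;
.LAYOUT claims_layout;
    .FIELD claim_id  * VARCHAR(12);
    .FIELD claim_amt * VARCHAR(18);
.DML LABEL ins_claims;
    INSERT INTO stg_db.claims_stg (claim_id, claim_amt)
    VALUES (:claim_id, :claim_amt);
.IMPORT INFILE /data/in/claims.txt
    FORMAT VARTEXT '|'
    LAYOUT claims_layout
    APPLY ins_claims;
.END MLOAD;
.LOGOFF;
EOF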
Environment: Informatica PowerCenter 9.1.1, IDQ 9.1.1, Oracle 11g, Teradata 14.0, Teradata SQL Assistant, MS SQL Server 2012, MySQL, Erwin 9.2, PuTTY, Shell Scripting, Bitbucket, WinSCP, Notepad++, Rally, Control-M, Tableau 9.2.
Confidential, New York City, NY
Informatica Developer
Responsibilities:
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Participated in data analysis, data profiling, data dictionary, and metadata management. Used SQL to do the data profiling.
- Collaborated with business users to collect requirements and prepared ETL technical specifications.
- Developed, supported, and maintained the ETL processes for exporting data from other applications into the reporting data mart using Informatica PowerCenter 8.5.1.
- Designed, built, and maintained mappings, sessions, and workflows for the target data load process using Informatica, PL/SQL, and UNIX.
- Implemented customer history data capture for catalogue tables using SCD Type 2.
- Designed mappings for Slowly Changing Dimensions, used Lookup (connected and unconnected), Update strategy and filter transformations for loading historical data.
- Developed stored procedures, used them in the Stored Procedure transformation for data processing, and used data migration tools.
- Worked with SQL*Loader to load data from flat files obtained from various facilities.
- Experience in debugging execution errors using Data Services logs (trace, statistics, and error) and by examining the target data.
- Worked extensively with Informatica tools such as Source Analyzer, Warehouse Designer, and Workflow Manager.
- Extensively used Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator transformations in extracting data in compliance with the business logic developed.
- Wrote SQL overrides in source qualifier to filter data according to business requirements.
- Wrote Unix shell scripts for scheduling Informatica pre/post session operations.
- Created different parameter files and started sessions using these parameter files with the pmcmd command to change session parameters, mapping parameters, and variables at runtime (a parameter-file sketch follows this list).
- Tuned the mappings by removing the source/target bottlenecks and expressions to improve the throughput of the data loads.
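A minimal sketch of driving a run with a client-specific parameter file, assuming hypothetical folder, workflow, session, service, and connection names (DW_FOLDER, wf_CLIENT_LOAD, s_m_load_client, INFA_IS, INFA_DOM, ORA_*).

#!/bin/ksh
# Generate a per-client parameter file, then start the workflow with it
# so session/mapping parameters and variables change at runtime.
CLIENT=$1
PARAMFILE=/infa/params/wf_client_load_${CLIENT}.par
cat > "$PARAMFILE" <<EOF
[DW_FOLDER.WF:wf_CLIENT_LOAD.ST:s_m_load_client]
\$\$CLIENT_ID=${CLIENT}
\$DBConnection_SRC=ORA_${CLIENT}
EOF
pmcmd startworkflow -sv INFA_IS -d INFA_DOM \
    -u "$INFA_USER" -p "$INFA_PWD" \
    -f DW_FOLDER -paramfile "$PARAMFILE" -wait wf_CLIENT_LOAD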
Environment: Informatica PowerCenter 8.5.1, IDQ 8.5.1, Oracle 10g, Toad, SQL Developer, UNIX, BOXI, PL/SQL, Autosys, PuTTY, WinSCP.