Sr. ETL Informatica Developer Resume
San Francisco, CA
SUMMARY:
- Around 9 years of IT experience in the development and implementation of data warehouse systems, with expertise in Informatica and Oracle and in quality assurance for ETL, backend, web-based, and client/server applications.
- Experience using query tools for Oracle, Teradata, Sybase, DB2 and MS SQL Server to validate reports and troubleshoot data quality issues.
- Experience in Pharmaceutical, Financial, Banking, Brokerage, and Securities industries
- Extensive ETL experience in Informatica 8.6.1/8.1/7.1/6.2/5.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager)
- Involved in data analysis using Query Analyzer; understood requirements and prepared mapping layout documents for implementation in Informatica code.
- Expertise in OLTP/OLAP System Study, developing Database Schemas (Star schema and Snowflake schema) used in relational and dimensional modeling.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various SAS procedures and handling large databases to perform complex data manipulations.
- Experience in testing and writing SQL and PL/SQL statements.
- Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects
- Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into the warehouse using Informatica.
- Proficient experience in different Databases like Oracle, SQL Server, DB2 and Teradata.
- Expertise in Developing PL/SQL Packages, Stored Procedures/Functions, triggers.
- Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Expertise in querying and testing RDBMS such as Oracle and MS SQL Server using SQL and PL/SQL for data integrity.
- Proficient in Oracle 10g/9i/8i/7.3 and in PL/SQL development using Toad, PL/SQL Developer, SQL Navigator, Perl, UNIX, and Korn shell scripting.
- Has expertise in Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
- Experience in UNIX shell scripting and in configuring cron jobs for Informatica session scheduling.
- Worked on issues with migration of data loading jobs from development to testing.
- Expertise in implementing complex business rules by creating mappings and various transformations.
- Experience in testing XML files and validating the data loaded to staging tables.
- Strong working experience on DSS (Decision Support Systems) applications, and Extraction, Transformation and Load (ETL) of data from Legacy systems using Informatica.
- Good experience in Cognos & Business Objects Reports testing
- Experience in Performance Tuning of SQL and Stored Procedures.
- Automated and scheduled the Informatica jobs using UNIX Shell Scripting.
- Experience in all phases of Software Development life cycle.
- Application Data warehousing experience in Banking, Pharmaceutical and Insurance.
- Performed Unit, Integration and System Testing.
- Good analytical and problem solving skills.
- Team player with excellent Organization and Interpersonal skills and ability to work independently and deliver on time.
- Expert knowledge in UNIX shell scripting (Korn shell/Bourne shell)
- Team-oriented, with the ability to understand and adapt to new technologies and environments quickly. Good at analysis and troubleshooting.
- Excellent analytical, programming, written and verbal communication skills.
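The cron-based scheduling of Informatica sessions mentioned above can be sketched as a small wrapper script. This is a minimal sketch: the paths, domain/service names, folder, and workflow names are hypothetical placeholders, and the script echoes the `pmcmd` command rather than executing it, so it is safe to run anywhere.

```shell
#!/bin/sh
# Sketch of cron-driven Informatica session scheduling. All paths and
# Informatica domain/service/folder names below are hypothetical.
#
# Example crontab entry calling this script for a 2:00 AM nightly load:
#   0 2 * * * /opt/etl/scripts/run_workflow.sh wf_nightly_load >> /var/log/etl/nightly.log 2>&1

# Build the pmcmd command that starts a named workflow and waits for it.
build_pmcmd_cmd() {
    wf="$1"
    echo "pmcmd startworkflow -sv INT_SVC -d ETL_DOMAIN -uv PM_USER -pv PM_PASS -f ETL_FOLDER -wait $wf"
}

# Echo the command instead of executing it, so the sketch runs anywhere;
# a real wrapper would eval the command and check its exit status.
build_pmcmd_cmd wf_nightly_load
```

Passing credentials via `-uv`/`-pv` (environment-variable indirection) keeps passwords out of the crontab and process listings.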
TECHNICAL SKILLS:
Data Warehousing: Informatica PowerCenter 10.x/9.x/8.x, IDQ
Data Analysis: Requirement Analysis, Business Analysis, detail design, data flow diagrams, data definition table, Business Rules, data modeling, Data Warehousing, system integration.
Data Modeling: Dimensional Data Modeling (Star Schema, Snow-Flake, FACT-Dimensions), Conceptual, Physical and Logical Data Modeling, ER Models, OLAP, OLTP concepts.
Databases: Oracle 12c/11g/10g/9i/8i, Teradata 15/14/12, MySQL, DB2, SQL Server 2012/2008, Sybase.
Reporting: Cognos, SSRS, Dashboards.
Programming: SQL, PL/SQL, Unix Shell Scripting, Java, XML.
Environment: Windows, Unix, Linux.
Other: Agile (including Kanban) and Waterfall methodologies, MS Office tools, Visio.
WORK EXPERIENCE:
Confidential, San Francisco, CA
Sr. ETL Informatica Developer
Responsibilities:
- Extensively worked on Informatica IDE/IDQ.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ's standardized plans for address and name cleanup.
- Worked on IDQ file configuration on users' machines and resolved related issues.
- Used IDQ to complete initial data profiling and to remove duplicate data.
- Worked extensively on IDQ administration tasks, serving as both IDQ administrator and IDQ developer.
- Performed multiple tasks effectively and was involved in troubleshooting issues.
- Created tables, applications, and workflows, and deployed them to the Data Integration Service for execution.
- Played a key role in the team's development of mappings and workflows and in troubleshooting issues.
- Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor results, and validate them against requirements.
- Designed data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
- Performed extensive backend testing by writing Complex SQL queries.
- Developed MDM solutions for workflows, de-duplication, validations, etc., and facilitated data load and syndication.
- Deployed Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browsing of the metadata catalog, and data lineage and visibility into data objects, rules, transformations, and data.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
Environment: Oracle 11g, PL/SQL, SQL, Informatica Data Quality 9.x, IDQ Admin Console, UNIX.
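The backend testing with complex SQL described above often comes down to reconciling row counts between staging and target tables. A minimal sketch of a helper that generates such a validation query; the table names are hypothetical, and in practice the output would be piped to sqlplus:

```shell
#!/bin/sh
# Generate a source-vs-target row-count reconciliation query for backend
# ETL testing. The table names passed in are placeholders for real tables.
rowcount_check_sql() {
    src="$1"
    tgt="$2"
    cat <<EOF
SELECT s.cnt AS src_cnt,
       t.cnt AS tgt_cnt,
       CASE WHEN s.cnt = t.cnt THEN 'PASS' ELSE 'FAIL' END AS result
FROM   (SELECT COUNT(*) AS cnt FROM $src) s,
       (SELECT COUNT(*) AS cnt FROM $tgt) t;
EOF
}

# Print the query for a hypothetical staging/warehouse table pair.
rowcount_check_sql STG_CUSTOMER DW_CUSTOMER
```

Generating the SQL from a script keeps one template reusable across every table pair loaded by the mappings.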
Confidential, Minneapolis, MN
Sr. ETL Informatica developer
Responsibilities:
- Involved in business requirement analysis and design, prepared functional and technical design document
- Involved in creating logical, physical data models for cleansing model
- Involved in data profiling, data quality and data cleansing and metadata management
- Extracted data from Salesforce.com (SFDC), Oracle, Excel, and data files using Informatica ETL mappings and SQL/PLSQL scripts
- Performed backend testing of the database by writing PL/SQL queries to test the integrity of the application and Oracle databases using TOAD.
- Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup, and Router transformations to extract, transform, and load data to the cleansing area
- Wrote complex PLSQL functions/procedures/packages
- Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager
- Wrote Test cases and executed test scripts in Cleansing area
- Involved in performance tuning of Informatica code and PLSQL scripts
- Supported during QA/UAT/PROD deployments
- Wrote UNIX Korn shell scripts for file transfer/archiving.
- Involved in Reviews as per ETL/Informatica standards and best practices
- Created Autosys JIL files and executed to generate Autosys Jobs.
- Scheduling of Informatica workflows/sessions using Autosys.
Environment: Informatica 9.1 (Repository Manager, Admin Console, Designer, Workflow Manager, Workflow Monitor), Toad, Teradata, Oracle, UNIX, Autosys, PL/SQL.
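The file transfer/archiving Korn shell scripts mentioned above can be sketched roughly as follows. Directory names are hypothetical, and the sketch sticks to POSIX sh constructs so it runs under ksh or sh alike:

```shell
#!/bin/sh
# Sketch of a post-load file archiving step: move processed .dat files out
# of the landing directory, timestamp them, and compress them.
archive_processed_files() {
    src_dir="$1"    # landing directory containing processed files
    arch_dir="$2"   # archive directory
    mkdir -p "$arch_dir"
    stamp=$(date +%Y%m%d)
    for f in "$src_dir"/*.dat; do
        [ -f "$f" ] || continue    # no matches: glob stays literal, skip it
        base=$(basename "$f")
        mv "$f" "$arch_dir/$base.$stamp"
        gzip -f "$arch_dir/$base.$stamp"
    done
}
```

Archiving with a date suffix preserves a replayable copy of each day's feed while keeping the landing directory clean for the next run.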
Confidential
ETL developer
Responsibilities:
- Analyze and design components based on defined specifications. Convert Functional Requirements to Technical Design Documents
- Understand Source Data Systems. Develop, modify, test ETL components based on defined specifications and deploy these components
- Creating Unit Test cases templates and working with the BA team on all the test cases.
- Error Validation and quality check of the deliverables
- Working on defects, enhancements and new requirements
- Code Review documents and templates
- Performance Tuning in Informatica and Oracle PL/SQL objects
- Scheduled ETL loads into the DW using the Control-M scheduling tool
- Adhering to best practices to improve the quality of the deliverables
- Handling job failures in Informatica, database and reporting issues
- Conceptualized and implemented various enhancements and continuous improvements involving Informatica, Oracle Database, and Control-M development
- Worked on creating various end-to-end reporting feeds, including design and development using technologies such as Informatica, Oracle Database, Hyperion V8, and Control-M.
Environment: Informatica, ETL, PL/SQL, Oracle, SQL, UAT, UNIX, Shell scripting
Confidential, Richardson, TX
ETL Informatica Developer
Responsibilities:
- Worked on requirements with Business Analysts and business users, and was also involved in working with data modelers.
- Worked closely with data population developers, multiple business units and data solutions engineer to identify key information for implementing the Data warehouses.
- Analyzed logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Used Informatica power center as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.
- Imported mapplets and mappings from Informatica developer (IDQ) to Power Center.
- Wrote Teradata BTEQ scripts as well as Informatica mappings using TPT to load data from staging to base tables.
- Fine-tuned Teradata BTEQ scripts as necessary using EXPLAIN plans and by collecting statistics
- Extensive exposure to data extraction, conversion loading from various sources including flat files, Oracle, SQL Server, DB2, CCR and CCD.
- Created and used the Normalizer Transformation to normalize the flat files in the source data.
- Extensively built mappings with SCD1, SCD2 implementations as per requirement of the project to load Dimension and Fact tables.
- Used Evaluate expression options to validate and fix the code using Debugger tool while testing Informatica code.
- Handled initial (i.e., history) and incremental loads into the target database using mapping variables.
- Worked with Workflow Manager for maintaining Sessions by performing tasks such as monitoring, editing, scheduling, copying, aborting, and deleting.
- Worked on performance tuning at both the Informatica level and Database as well by finding the bottlenecks.
- Developed UNIX shell scripts to run the pmcmd functionality to start and stop sessions, batches and scheduling workflows.
- Performed Unit testing and created unit test plan of the code developed and involved in System testing and Integration testing as well. Coordinated with the testers and helped in the process of integration testing.
- Heavily involved in production support on a rotational basis and supported the DWH system using the ticketing tool for issues raised by business users.
- Experience in working with reporting team in building collection layer for reporting purpose
Environment: Informatica PowerCenter 9.5.1, Oracle, SQL Server 2008/2012, Facets, Oracle SQL Developer, Tidal Scheduler, Windows.
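The initial vs. incremental load handling via mapping variables mentioned above is typically driven by a generated session parameter file. A minimal sketch, assuming hypothetical folder, workflow, session, and variable names:

```shell
#!/bin/sh
# Write an Informatica parameter file that sets a mapping variable used to
# bound the incremental extract. All names below are hypothetical.
write_param_file() {
    out_file="$1"
    last_run="$2"   # last successful extract date, e.g. 2015-06-30
    cat > "$out_file" <<EOF
[ETL_FOLDER.WF:wf_incremental_load.ST:s_m_load_target]
\$\$LAST_EXTRACT_DATE=$last_run
EOF
}

# A full/history load would simply seed the variable with an early date.
write_param_file /tmp/wf_incremental_load.param 1900-01-01
```

The session then filters its source query on `$$LAST_EXTRACT_DATE`, so the same mapping serves both the one-time history load and the daily delta.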
Confidential, Charlotte, NC
ETL Developer
Responsibilities:
- Responsible for requirement gathering from various groups. Followed the Iterative Waterfall model for the Software Development Life Cycle (SDLC) process.
- Designed and developed Informatica mapping codes to build business rules to load data. Extensively worked on Informatica Lookup, stored procedure and update transformations to implement complex rules and business logic.
- Analyzed and created facts and dimension tables.
- Analyzed business requirements and worked closely with the various application teams and business teams to develop ETL procedures that are consistent across all applications and systems.
- Developed the functionality at the database level PL/SQL using Toad tool as well at the Unix OS level using shell scripting.
- Experience with dimensional modelling using star schema and snowflake models.
- Experienced in identifying and documenting data integration issues and challenges such as duplicate data, non-conformed data, and unclean data
- Imported Source/Target Tables from the respective databases and created Reusable Transformations (Joiner, Routers, Lookups, Rank, Filter, Expression and Aggregator), Mapplets and Mappings using Designer module of Informatica.
- Used Informatica power center for (ETL) extraction, transformation and loading data from heterogeneous source systems.
- Participated in PL/SQL and Shell scripts for scheduling periodic load processes.
- Worked extensively on SQL coding to check the data quality coming from the respective parties.
- Worked cooperatively with team members to identify and resolve various issues relating to Informatica and databases.
- Applied performance tuning techniques to cubes to reduce calculation time, and partitioned cubes.
Environment: Informatica PowerCenter 8.1, Oracle 9i, SQL, SQL Server 2008, SQL Server Management Studio, Bitbucket, PL/SQL, SQL*Plus, Windows XP, UNIX Shell Scripting.