Sr Informatica Developer Resume
Milwaukee, WI
PROFESSIONAL SUMMARY:
- Around 9 years of IT experience, including strong data warehousing, ETL, and OLAP experience using Informatica PowerMart and PowerCenter 10.1/9.6.0/8.6/8.1/7.x/6.2/6.1/5.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations), Business Objects, Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2014/2008/2005/2000/7.0/6.5, and MS Access 2007/2005.
- Experience in all phases of the data warehouse lifecycle: Requirement Analysis, Design, Coding, and Deployment.
- Experience working with business analysts during the Requirement Analysis phase to identify and understand requirements and translate them into ETL code.
- Experience in creating High Level Design and Detailed Design in the Design phase.
- Extensively worked on ETL mappings and on the analysis and documentation of OLAP reporting requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
- Experience in Repository Configuration and processing tasks using Workflow Manager & Workflow Monitor to move data from multiple sources into targets.
- Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).
- Experience in loading ERP source/target data such as SAP R/3 and SAP BW using PowerConnect.
- Involved in full life cycle development of Data Warehousing.
- Experience in integrating various relational sources such as Oracle, DB2, Sybase, SQL Server, and MS Access, and non-relational sources such as flat files, into the staging area.
- Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
- Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.
- Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and TPT, and in tools such as SQL Assistant and Viewpoint.
- Well-versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool space issues, and applying compression for space reclamation.
- Expert in writing optimized SQL queries in Oracle, SQL Server, Teradata, MySQL, and Hive.
- Experience writing SQL with analytical functions such as ranking functions, reporting aggregate functions, LAG/LEAD, and FIRST/LAST functions (see the sketch following this summary).
- Experience in creating Reusable Transformations (Joiner, Sorter, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Aggregator, and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Experience using exception-handling strategies to capture errors and referential-integrity violations during loading processes and to notify the source team of exception records.
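A minimal sketch of the analytical-function style referenced above; the table and column names (orders, customer_id, order_amount, order_date) are hypothetical, not from any specific engagement:

```sql
-- Hypothetical example: rank each customer's orders by amount and compare
-- each order's amount with the customer's previous and first orders by date.
SELECT customer_id,
       order_id,
       order_amount,
       RANK()                    OVER (PARTITION BY customer_id ORDER BY order_amount DESC) AS amount_rank,
       LAG(order_amount)         OVER (PARTITION BY customer_id ORDER BY order_date)        AS prev_amount,
       FIRST_VALUE(order_amount) OVER (PARTITION BY customer_id ORDER BY order_date)        AS first_amount
FROM   orders;
```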
TECHNICAL SKILLS:
Operating Systems: Windows NT/2000/XP/Vista, UNIX (Sun Solaris, AIX, HP-UX), Linux.
Application Software: MS-Office.
Tools & Utilities: PL/SQL Developer, TOAD, SQL Developer, EditPlus, SQL*Plus, SVN.
Languages: SQL, PL/SQL, C, C++, Java, Pro*C.
RDBMS: Oracle 11g/10g/9i/8i, SQL Server 2005, Teradata 13/12.
Scripting Languages: Shell, Perl, Python, JavaScript, HTML, XML, XSL.
ETL: Informatica PowerCenter 9.x/8.6.1/8.1/7.1/6.2, DataStage, SSIS.
Data Modeling: Ralph Kimball methodology, Bill Inmon methodology, Star and Snowflake schemas, fact tables, dimension tables, physical and logical modeling, normalization and denormalization.
Other Tools: Putty, TOAD, PL/SQL Developer, WinSCP, Crystal Reports.
OLAP: Hyperion Essbase and Planning.
PROFESSIONAL EXPERIENCE:
Confidential, Milwaukee, WI
Sr Informatica Developer
Responsibilities:
- Involved in the full SDLC of the project including Design, Development and Testing.
- Gathered requirements from business analysts and end users, analyzed them, and developed Technical Design Documents (TDD) from Business Requirement Documents (BRD).
- Performed one-time history loads in different layers for all Business Units.
- Involved in extensive performance tuning by determining bottlenecks at various points such as sources, targets, mappings, and sessions.
- Analyzed session logs to troubleshoot workflow/session failures and long-running sessions.
- Worked with SQL Server 2014 to run SQL statements for storing, retrieving, and manipulating data in a relational database.
- Used the SQL Server Database Tuning Advisor to fine-tune source queries for better performance.
- Copied the workflows in different layers and updated the parameter files for all Business Units.
- Created and configured Workflows, Worklets, and Sessions to load data to target tables using Informatica Workflow Manager.
- Modified the commit interval, buffer block size, and DTM buffer settings in the session properties to improve performance for long-running sessions.
- Added new worklets with different sessions and partitioned workflow sessions loading quarterly data to improve session performance.
- Exported and imported workflow XML files to standardize session properties across a large number of sessions in a workflow.
- Enhanced the existing mappings and created sessions and worklets to apply the business logic as per the requirements.
- Extracted data from flat-file and SQL Server table sources and loaded it into the target database.
- Created missing indexes and dropped unused indexes on the source tables for faster data retrieval (see the sketch following this list).
- Migrated workflows from UAT to PROD repositories and validated them.
- Scheduled workflows using Autosys and Tivoli Scheduler.
- Performed incremental loads post history data loads across all the Business Units.
- Modified existing stored procedures and replaced the Pre-SQL query in the session properties to improve performance.
- Created JIL files with command and box jobs to schedule the incremental catch-up workflows.
- Worked with the QA team on defect fixes across the UAT and PROD environments.
- Performed ETL testing to validate the data after loading.
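A minimal sketch of the index work described above, in SQL Server 2014 syntax; the table, column, and index names are hypothetical:

```sql
-- Hypothetical example: add a covering index for a frequent ETL source query
-- and drop an index the workload no longer uses (all names are illustrative).
CREATE NONCLUSTERED INDEX IX_SalesOrders_OrderDate
    ON dbo.SalesOrders (OrderDate)
    INCLUDE (CustomerID, TotalAmount);  -- covers the columns the session reads

DROP INDEX IX_SalesOrders_LegacyStatus ON dbo.SalesOrders;
```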
Environment: Informatica PowerCenter 10.1, SQL Server 2014, UNIX Shell Scripting, Red Hat Linux, HP Service Manager, Subversion, Tivoli, Autosys.
Confidential, Albany, NY
MDM Developer
Responsibilities:
- Expertise in Address Doctor integration with PowerCenter, Developer, and MDM.
- Upgraded Address Doctor in PowerCenter and Developer.
- Configured ActiveVOS with the Hub and ensured that the default workflows were integrated.
- Experience configuring BE workflows with IDD.
- Installed MDM in the Windows development environment.
- Expertise in handling Address Cleansing before populating the data into landing tables.
- Worked on the ETL process for bringing data into the MDM landing tables.
- Profiled the data using the Developer/Analyst tools to identify data-integrity issues across the different sources (see the profiling sketch following this list).
- Cleansed most of the data in PowerCenter/Developer before placing it in the MDM landing tables.
- Defined trust scores for the source data based on data-quality analysis and discussions with stakeholders.
- Developed validation rules based on the profiled data quality and data analysis.
- Finalized the key fields after discussions with the people most knowledgeable about the data.
- Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
- Configured match rule set filters for meeting the different data scenarios.
- Performed match/merge and ran match rules to check the effectiveness of MDM on the data, then fine-tuned the match rules.
- Developed ad hoc requests, such as Execute Batch Delete SOAP requests, for deleting specific data from the underlying tables.
- Developed an Unmerge user exit to reprocess records that needed to be handled differently.
- Worked closely with the Data Steward team on designing, documenting, and configuring Informatica Data Director.
- Used ActiveVOS to configure workflows such as one-step approval, merge, and unmerge tasks.
- Configured static lookups, dynamic lookups, bulk uploads, extended search and Smart search in IDD.
- Configured JMS Message Queues and appropriate triggers for passing the data to the contributing systems.
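A minimal sketch of the kind of profiling query used before defining match rules; the landing table and columns (lnd_customer, first_name, last_name, postal_code) are hypothetical:

```sql
-- Hypothetical example: surface potential duplicate parties in a landing table
-- before match columns and rule sets are configured (names are illustrative).
SELECT UPPER(TRIM(last_name))  AS last_name_norm,
       UPPER(TRIM(first_name)) AS first_name_norm,
       postal_code,
       COUNT(*)                AS record_count
FROM   lnd_customer
GROUP BY UPPER(TRIM(last_name)), UPPER(TRIM(first_name)), postal_code
HAVING COUNT(*) > 1
ORDER BY record_count DESC;
```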
Environment: Multi-Domain MDM 10.0, IDD, Oracle 11g, Oracle PL/SQL, Windows Application Server, ActiveVOS, Informatica PowerCenter 10.1, Informatica Developer, Address Doctor 5.1, PowerShell.
Confidential, Bloomington, IL.
Sr Informatica Developer
Responsibilities:
- Developed Technical Design Documents (TDD) from user requirements and Functional Requirements Documents (FRD).
- Worked on the build phase of the new data center migration for the TAC project.
- Implemented the change-management process while enhancing existing code and fixing bugs.
- Modified existing mappings to accommodate new business requirements due to single instance impact at the source.
- Worked on Data Conversion and Data Analysis.
- Troubleshot issues in Informatica.
- Supported AS400/iSeries applications, starting/stopping hub jobs, clearing queues and logjams.
- Created and Configured Workflows, Worklets, and Sessions to transport the data to target using Informatica Workflow Manager.
- Worked with Oracle SQL for storing, retrieving, and managing document-oriented and semi-structured data.
- Supported .NET-based applications, including configuration, clean restarts, and management.
- Involved in extensive performance tuning by determining bottlenecks at various points such as targets, sources, mappings, and sessions; this led to better session performance.
- Worked on Agile methodology for quick responses to change and continuous development.
- Improved the availability and timeliness of test environments.
- Involved in extracting data (CSV files) from mainframes and loading it into Oracle target tables using PowerCenter workflows.
- Performed ETL configuration and troubleshooting.
- Worked with Teradata SQL statements for storing, retrieving, and manipulating data in a relational database.
- Worked with PowerExchange for Mainframe to rapidly integrate and migrate mainframe data during mergers, acquisitions, and reorganizations.
- Strong knowledge of Oracle EBS/ERP.
- Worked with a Security Analyst to upload the UNIX script into the Dynamic Password Lookup tool to securely access application credentials.
- Used ODS for integrating disparate data from multiple sources for business operations, analysis and reporting.
- Used Informatica PowerExchange for Mainframe with PowerCenter to access and integrate mainframe-based enterprise data into the Oracle target database.
- Worked with the Oracle ERP application for managing accounting, procurement, and projects throughout the organization.
- Involved in Agile process for continuous attention to technical excellence and good design.
- Worked with different transformations in Informatica PowerCenter to apply business logic.
- Created mappings in PowerCenter to migrate data from flat files and Sybase to the data warehouse.
- Strong technical skills with UNIX/Linux systems.
- Used the DB2 Load utility to load data from the file into the FDW table.
- Managed the ETL Domain/Repository/Integration Services and workflows (stop/start) whenever the database servers were bounced.
- Used Oracle ERP software with integrated cloud applications to deliver the functionality, analytics, security, and collaboration tools needed by the business.
- Used the ODS for data scrubbing, resolving redundancy, and checking compliance with the business rules.
- Worked with the DBA to create the derived tables required for ETL development.
- Used UNIX shell scripting on Linux platforms to automate the loading process and to generate restart tokens for cold-starting the workflows.
- Created sessions and workflows to carry out the ETL process and configured sessions and connection parameters using the PowerCenter tool.
- Extensively used UNIX commands to check processes, view logs, and troubleshoot issues.
- Used UNIX scripts to extract the initial/incremental SQL results into a file and then load them into the FDW table (an incremental-extract sketch follows this list).
- Worked closely with UNIX SAs to troubleshoot OS-related issues with applications and server maintenance activities.
- Involved in data migration, extraction, cleansing, and staging of operational sources using ETL processes.
- Used the Control-M tool to automate and schedule the ETL workflows.
- Worked with different teams to troubleshoot workflow/Control-M job failures caused by database and network issues.
- Developed unit test cases and unit test plans to verify that data loaded correctly.
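A minimal sketch of the incremental-extract pattern referenced above; the control table, job name, and source columns are hypothetical:

```sql
-- Hypothetical example: pull only rows changed since the last successful load,
-- using a watermark kept in a control table (all names are illustrative).
SELECT s.*
FROM   src_transactions s
WHERE  s.last_update_ts > (SELECT last_load_ts
                           FROM   etl_load_control
                           WHERE  job_name = 'FDW_TXN_LOAD');

-- After a successful load, advance the watermark.
UPDATE etl_load_control
SET    last_load_ts = CURRENT_TIMESTAMP
WHERE  job_name = 'FDW_TXN_LOAD';
```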
Environment: Informatica PowerCenter 9.6.0/10.1, IBM AIX, Teradata, Oracle 12c, UNIX Shell Scripting, Red Hat Linux, HP Service Manager, PowerExchange, .NET, Subversion, Control-M.
Confidential, Palo Alto CA
ETL Designer
Responsibilities:
- Provided development work for the Pricing Analytics (PRAO) application.
- Analyzed complex user requirements, procedures, and problems to improve the existing system design.
- Designed, developed, configured, programmed, and implemented software applications, packages, and components customized to meet specific needs and requirements.
- Created and imported/exported various sources, targets, and transformations using Informatica PowerCenter 9.0.
- Created Informatica mappings, sessions, and workflows.
- Created reusable transformations and mapplets by using Lookup, Aggregator, Normalizer, Update strategy, Expression, Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer, and Mapplet Designer, respectively.
- Extracted data from Oracle and flat files and loaded it into SQL Server.
- Worked on data conversion, standardization, correction, error tracking, and enhancement.
- Generated XML using Informatica and loaded it into PRAO.
- Created database tables and indexes.
- Developed end-to-end application components involving the business, persistence, database, and web-services layers.
- Reviewed and modified programs to ensure technical accuracy, security, and reliability.
- Involved in running Hadoop jobs for processing data coming from different sources.
- Involved in the design and development of a Hadoop cluster using Apache Hadoop for a POC and data analysis.
- Imported and exported data into HDFS and Hive to support the BI team's reporting.
- Analyzed large data sets by running Hive queries and shell scripts (see the Hive sketch following this list).
- Wrote shell scripts to automate the process end to end using the Tidal scheduler.
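A minimal sketch of the kind of HiveQL aggregation used for such analysis; the table and columns (sales_orders, order_date, region, list_price) are hypothetical:

```sql
-- Hypothetical HiveQL example: summarize daily order volume and gross amount
-- for BI reporting (all names are illustrative).
SELECT order_date,
       region,
       COUNT(*)        AS order_cnt,
       SUM(list_price) AS gross_amount
FROM   sales_orders
WHERE  order_date >= '2016-01-01'
GROUP BY order_date, region;
```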
Environment: Informatica PowerCenter 8.6, SQL Server 2012, Oracle 11g, HP-UX, SFDC, Vertica, Hadoop, Hive, Sqoop, Linux, Tidal.
Confidential
ETL Informatica Developer
Responsibilities:
- Involved in analyzing the data models of legacy implementations, identifying the sources for various dimensions and facts for different data marts according to star schema design patterns.
- Developed complex mappings using Source Qualifier, Aggregator, connected and unconnected Lookup, Filter, and Update Strategy transformations.
- Extensively used various data-cleansing and data-conversion functions in transformations.
- Used Debugger to validate transformations by creating breakpoints to analyze, and monitor Data flow.
- Tuned Informatica session performance by increasing block size, data cache size, sequence buffer length, and target-based commit interval, and tuned mappings by dropping and recreating indexes.
- Worked along with the QA team and provided production support by monitoring the daily processes.
- Defined Target Load Order Plan for loading data into Target Tables.
- Implemented the Slowly Changing Dimensions methodology and developed mappings to keep track of historical data (a Type 2 sketch follows this list).
- Wrote SQL overrides in Source Qualifiers and Lookups according to business requirements.
- Involved in troubleshooting the loading failure cases, including database problems.
- Responsible for Documentation of the Test cases, completed modules and the results of acceptance testing.
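A minimal sketch of the SCD Type 2 pattern in plain SQL; in practice this logic was built as Informatica mappings with Update Strategy transformations, and the table and column names here are hypothetical:

```sql
-- Hypothetical example (all names illustrative).
-- 1) Expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
SET    eff_end_date = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = d.customer_id
                 AND  s.address <> d.address);

-- 2) Insert a new current row for every staged record without one
--    (covers both brand-new and just-expired customers).
INSERT INTO dim_customer (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_id = s.customer_id
                     AND  d.current_flag = 'Y');
```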
Environment: Informatica PowerCenter 7.1.1, Oracle 9i, MS SQL Server 2005, UNIX, PL/SQL, UNIX shell scripting, SQL*Plus, SQL, TOAD, Reports, MS Excel, MS Access, Flat Files, XML.