Informatica Consultant Resume
Oakland, CA
SUMMARY:
- 7+ years of overall experience in the Information Technology Industry.
- 6+ years of experience working with data warehouses using Informatica PowerCenter 6.1, 7.1, 8.1, and 8.6 (Designer, Repository Manager, Workflow Manager, and Workflow Monitor).
- 4+ years of data modeling experience using ERWIN.
- Extensive knowledge of dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
- Experience in building operational data stores (ODS), data marts, and enterprise data warehouses.
- Expertise in configuration, performance tuning, and installation of Informatica, and in integration of various data sources such as Oracle, Teradata, MS SQL Server, IBM DB2, XML, and flat files.
- 6+ years of extensive experience with databases (MS SQL Server 2000/2005/2008, Oracle 8i/9i/10g, Teradata, and IBM DB2).
- Extensive work with PL/SQL and Oracle performance tuning using TKPROF, SQL trace, explain plan, SQL hints, Oracle partitioning, and various index and join types.
- Understanding & working knowledge of Informatica CDC (Change Data Capture).
- Experienced in tuning Informatica mappings to identify and remove processing bottlenecks.
- Experienced with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
- Hands-on experience with Informatica PowerExchange for loading/retrieving data from mainframe systems.
- Good working knowledge and experience with Oracle Data Integrator (ODI).
- Managed a key subject area of a very large relational database in a client/server environment.
- Experience in working with COBOL files, XML, and Flat Files.
- 4+ years of UNIX shell scripting. Hands-on exposure to UNIX environments and experience using third-party scheduling tools such as AutoSys.
- Experience with Teradata as the target for data marts.
- Wrote BTEQ scripts to meet business specifications (a brief sketch follows this summary).
- Knowledge of Oracle Apps (Oracle Enterprise Business Intelligence).
- Proficient knowledge in ETL using SQL Server Integration Services (SSIS).
- Strong skills in data analysis, data requirement analysis and data mapping for ETL processes.
- Ability to prioritize and execute tasks in a high-pressure environment.
- Experience in mentoring and providing knowledge transfer to team members, support teams and customers.
- Reliable, proactive, responsible, hardworking, and a good team player.
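The BTEQ work noted above is typically run from a UNIX shell wrapper. The following is a minimal sketch only; the TDPID, logon credentials, table, and file names are hypothetical placeholders, not actual client values.

    #!/bin/sh
    # Hypothetical example: run a BTEQ extract from a shell wrapper (all names are placeholders).
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_password
    .EXPORT REPORT FILE = /data/extracts/claim_summary.txt
    SELECT claim_id, claim_status, paid_amount
    FROM   edw.claim_summary
    WHERE  load_date = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .EXPORT RESET
    .LOGOFF
    .QUIT 0
    EOF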
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 8.x/7.x/6.x, SSIS
BI Tools: Cognos BI Suite, SSRS, Business Objects 6.5
Databases: MS SQL Server 2005/2008, Oracle 8i/9i /10g, Teradata, IBM DB2, Sybase
Database Skills: Cursors, Stored Procedures, Functions, Views, Triggers, and Packages
Languages/Scripting: SQL, T-SQL, PL/SQL, UNIX shell scripting, Java, HTML, XML, CSS, JavaScript, C, C++, VB 6.0
Web Servers: IIS v5.0/6.0, Apache Tomcat
OS: Windows 2000/2003/XP/Vista/Windows 7, UNIX/Solaris, Red Hat Linux, AIX
Version Control: Visual Source Safe 6.0, CVS
Methodologies: Logical/Physical Data Modeling, Star/Snowflake Schema, Fact & Dimension Tables, ETL, OLAP, Agile, RUP, Software Development Life Cycle (SDLC)
Design/Process: UML
Tools: Toad, Erwin, Rational Rose, MS Project, JIRA, Test Director, Bugzilla, TestTrack Pro, MS Visio, Autosys, Turbo Data Generator, Advanced Data Generator.
PROFESSIONAL EXPERIENCE:
Confidential, Oakland, CA
Informatica Consultant
Environment: Informatica PowerCenter 8.6/8.1, Oracle 10g/11g, SQL Developer, Erwin, Flat Files, TOAD, PL/SQL, MS SQL Server, Crystal Reports, Autosys, Cognos, UNIX.
Responsibilities:
- Worked closely with the manager and database administrator.
- Analyzed functional specs and created technical documentation.
- Extensively used all the features of Informatica 8.6, including Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
- Responsible for development, support, and maintenance of ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
- Designed and developed several ETL processes using Informatica and UNIX shell scripts.
- Analyzed source data coming from Oracle, flat files, and MS Excel, and coordinated with the data warehouse team in developing the dimensional model.
- Created FTP and database (ODBC) connections for the sources and targets.
- Performed backup and restoration of repositories.
- Implemented various workflows using transformations such as Java, XML, Normalizer, Lookup, Aggregator, and Stored Procedure, and scheduled jobs for different sessions.
- Implemented Slowly Changing Dimension methodology for accessing the full history of account and transaction information.
- Scheduled sessions and batches on the Informatica server using Informatica Server Manager.
- Wrote and executed test cases for data transformations in Informatica.
- Scheduled mappings and workflows using Autosys.
- Created medium to complex PL/SQL stored procedures for integration with Informatica using Oracle 10g.
- Performed upgrades and maintenance of data warehouse tools.
- Coordinated with offshore teams and mentored junior developers.
- Analyzed and tuned Oracle 10g to improve performance.
- Developed shell scripts to automate file manipulation and data loading procedures (a brief sketch follows this list).
- Used Visual SourceSafe for version control of Informatica objects, Oracle views and packages, and test scripts.
- Developed UNIX shell scripts to control the process flow of Informatica workflows handling high-volume data.
- Prepared Test cases based on Functional Requirements Document.
- Developed PL/SQL procedures and functions for data ETL.
- Involved in Unit, System, Performance, Concurrent and QA testing.
- Worked with complex Cognos reports in Report Studio using master-detail relationships, drill-through, drill-up and drill-down, burst options, prompts, lists of values (LOV), and aggregate awareness.
- Created dashboards for a managerial overview with drill-up and drill-down capabilities.
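The file-manipulation scripting referenced above follows this general shape. This is a minimal sketch only; the directory paths, file pattern, and log location are hypothetical placeholders rather than actual project values.

    #!/bin/sh
    # Hypothetical example: stage incoming flat files ahead of an Informatica load.
    SRC_DIR=/data/incoming        # assumed landing directory for source files
    STG_DIR=/data/staging         # assumed staging directory read by the workflow
    ARC_DIR=/data/archive         # assumed archive directory
    LOG=/data/logs/file_load.log  # assumed log file

    for f in "$SRC_DIR"/sales_*.dat; do
        [ -f "$f" ] || continue                              # nothing matched the pattern
        [ -s "$f" ] || { echo "Empty file skipped: $f" >> "$LOG"; continue; }
        cp "$f" "$STG_DIR"/ && mv "$f" "$ARC_DIR"/           # stage a copy, archive the original
        echo "$(date '+%Y-%m-%d %H:%M:%S') staged $f" >> "$LOG"
    done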
Confidential
Informatica Developer
Environment: Informatica PowerCenter 8.1, PL/SQL, Workflow Manager, Workflow Monitor, Oracle 9i/10g, UltraEdit-32, Autosys, ROBO FTP, UNIX shell scripting.
Responsibilities:
- Interacted with users and business analysts to understand and document the requirements, and translated the functional specifications into technical specifications for designing the Informatica mappings.
- Involved in complete SDLC including Envisioning, Planning, Design, Development, User Acceptance Testing and Deployment.
- Developed source to target mapping specifications as per the business requirements.
- Extensively worked on Informatica Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Designed and developed Informatica mappings for data loads and data cleansing.
- Extracted and processed EDI and paper claims in order to submit inpatient, outpatient, and medical claims to the state as part of the encounter process.
- Utilized and consulted with the client on "best practices" approach to Project Management and Change Management.
- Documented the project documents such as mappings specifications, test cases and process flow diagrams.
- Extensively used UNIX shell scripting for the initial one-time bulk history loads.
- Developed process workflow to maintain the dependency between the jobs.
- Used Autosys to schedule the jobs in UNIX.
- Used application development tools such as Rapid SQL during unit testing.
- Extensively worked in the Performance tuning of the SQL and ETL processes.
- Attended Informatica user group meetings and participated in performance tuning group discussions.
- Developed UNIX shell scripts that use the pmcmd command-line utility to start and stop sessions (a brief sketch follows this list).
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the target according to user requirements.
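A minimal sketch of the kind of pmcmd wrapper mentioned above. The Integration Service, domain, folder, and workflow names are hypothetical placeholders, and credentials are assumed to be supplied through the PMUSER/PMPASS environment variables.

    #!/bin/sh
    # Hypothetical example: start or stop a PowerCenter workflow with pmcmd.
    INT_SVC=IS_DEV            # assumed Integration Service name
    DOMAIN=Domain_DEV         # assumed domain name
    FOLDER=CLAIMS_ETL         # assumed repository folder
    WORKFLOW=wf_load_claims   # assumed workflow name

    case "$1" in
        start)
            # -uv/-pv read the user name and password from environment variables
            pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" -uv PMUSER -pv PMPASS \
                  -f "$FOLDER" -wait "$WORKFLOW"
            ;;
        stop)
            pmcmd stopworkflow -sv "$INT_SVC" -d "$DOMAIN" -uv PMUSER -pv PMPASS \
                  -f "$FOLDER" "$WORKFLOW"
            ;;
        *)
            echo "Usage: $0 {start|stop}"
            exit 1
            ;;
    esac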
Confidential
Environment: Informatica PowerCenter 8.6, PL/SQL, Rapid SQL 7.5.1/7.5.5, Workflow Manager, Workflow Monitor, Oracle 11g, UltraEdit-32, Autosys, ROBO FTP, UNIX shell scripting.
Responsibilities:
- Involved in the design, development, and testing phases of ETL.
- Extensively worked on Informatica Designer, Work Flow Manager, Work Flow Monitor and Repository Manager.
- Developed source to target mapping specifications as per the business requirements.
- Involved in SQL tuning and troubleshooting of existing production jobs for performance issues.
- Developed Test Scripts and Test Cases for unit testing.
- Coordinated with .NET application teams on the premium billing reconciliation process.
- Developed a process workflow to understand the dependencies between jobs and the order in which they were to be scheduled.
- Coordinated between the onsite and offshore ETL teams as a lead offshore developer to meet scheduled project milestones and deadlines.
- Developed and modified UNIX shell scripts to meet requirements after system modifications, and monitored and maintained the batch jobs.
- Involved in the ongoing maintenance of the Monthly batch jobs.
- Responsible for QA migration and Production Deployment.
Confidential, Fort Worth, TX
ETL Developer
Environment: Informatica PowerCenter 7.1/8.1, Informatica Data Quality, Informatica Data Explorer, PowerExchange, Oracle 9i, Toad, PL/SQL, SQL*Plus, DB2, MS SQL Server 2008, Erwin, Windows XP, Autosys, Cognos, UNIX
Responsibilities:
- Gathered requirements, performed analysis, and designed technical specifications for the data migration according to business requirements.
- Developed logical and physical dimensional data models using ERWIN.
- Designed, developed, and improved complex ETL structures to extract, transform, and load data from multiple data sources into the data warehouse and other databases based on business requirements.
- Developed complex mappings and SCD Type I, Type II, and Type III mappings in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, Stored Procedure, XML, and SQL.
- Responsible for testing, modifying, debugging, documenting and implementation of Informatica mappings.
- Performed metadata validation, reconciliation and appropriate error handling in ETL processes.
- Developed test cases and test plans to complete unit testing; supported system testing.
- Planned and coordinated testing across multiple teams, tracked and reported status, and created testing cycle plans, scenarios, etc.
- Troubleshot data issues, validated result sets, and recommended and implemented process improvements.
- Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.
- Extensively worked with SQL queries. Created cursors, functions, stored procedures, packages, triggers, views, and materialized views using PL/SQL.
- Extensively worked with PL/SQL and Oracle performance tuning.
- Used ANALYZE, DBMS_STATS, explain plan, SQL trace, SQL hints, and TKPROF to tune SQL queries.
- Extensively used Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.), and various join methods (hash join, sort-merge join, nested loop join) to improve performance.
- Used Oracle Data Integrator (ODI) for web services integration.
- Used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
- Used Informatica PowerExchange for loading/retrieving data from mainframe systems.
- Hands-on experience with Informatica Metadata Manager.
- Designed, developed, and managed PowerCenter upgrades from v7.x to v8.6 and migrated ETL code from Informatica v7.x to v8.6; integrated and managed the PowerExchange CDC workload.
- Extensively worked with incremental loading using parameter files, mapping variables, and mapping parameters (a brief parameter-file sketch follows this list).
- Developed user-defined functions (UDFs) to extract data from flat files.
- Developed and modified UNIX shell scripts to meet requirements after system modifications, and monitored and maintained the batch jobs.
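Incremental loading with parameter files generally relies on refreshing the file before each run. The following is a minimal sketch only; the folder, workflow, session, and parameter names (FIN_DW, wf_load_orders, $$LAST_EXTRACT_DATE, and so on) are hypothetical placeholders.

    #!/bin/sh
    # Hypothetical example: regenerate a PowerCenter parameter file for an incremental load.
    PARAM_FILE=/data/params/wf_load_orders.param   # assumed parameter file used by the session
    BOOKMARK=/data/params/last_run_date.txt        # assumed bookmark holding the previous run date

    LAST_RUN=`cat "$BOOKMARK"`                     # e.g. 2010-06-30 23:59:59
    THIS_RUN=`date '+%Y-%m-%d %H:%M:%S'`

    cat > "$PARAM_FILE" <<EOF
    [FIN_DW.WF:wf_load_orders.ST:s_m_load_orders]
    \$\$LAST_EXTRACT_DATE=$LAST_RUN
    \$\$CURRENT_EXTRACT_DATE=$THIS_RUN
    EOF

    echo "$THIS_RUN" > "$BOOKMARK"                 # advance the bookmark for the next run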
Confidential, Los Angeles, CA
Informatica Developer
Environment: Informatica PowerCenter 7.1/8.1, Oracle 9i, Autosys, Erwin, CMS, MS PowerPoint, MS Visio, TOAD, PL/SQL, UNIX, SQL*Loader, SQL Server
Responsibilities:
- Involved in requirements gathering and analysis for the data marts, focusing on data analysis, data quality, and data mapping between the ODS, staging tables, and data warehouses/data marts.
- Designed and developed processes to support data quality issues and the detection and resolution of error conditions.
- Worked with the business analysts and the QA team on validation and verification of the development.
- Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.
- Implemented various scenarios related to slowly growing targets and slowly changing dimensions (Type 1, Type 2, Type 3).
- Implemented various business rules for data transformation using Informatica transformations such as Normalizer, Source Qualifier, Update Strategy, Lookup (connected/unconnected, static/dynamic cache), Sequence Generator, Expression, Aggregator, XML (source and generator), and Stored Procedure.
- Worked with newer Informatica transformations such as Java and Transaction Control.
- Used Teradata as a source and a target for a few mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
- Provided administrative functions such as creating and backing up repositories, setting up users, assigning permissions, and setting up folders in Repository Manager.
- Worked on existing Oracle Warehouse Builder workflows to produce the final source files for use.
- Wrote shell script utilities to detect error conditions in production loads and take corrective action, and wrote scripts to back up and restore repositories and log files (a brief sketch follows this list).
- Heavily involved in performance tuning of the Oracle database: using the TKPROF utility, working with partitioned tables, implementing a layer of materialized views to speed up lookup queries, using bitmap indexes for dimension tables, updating statistics with the DBMS_STATS package, and applying SQL hints.
- Migrated Informatica ETL objects from development to QA and production using deployment groups.
- Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.
- Coordinated with offshore teams and mentored junior developers.
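A minimal sketch of the repository backup scripting mentioned above. The repository, domain, user, and directory names are hypothetical placeholders, and the password is assumed to come from the REP_PASSWD environment variable.

    #!/bin/sh
    # Hypothetical example: nightly repository backup through pmrep.
    REPO=REP_DEV               # assumed repository service name
    DOMAIN=Domain_DEV          # assumed domain name
    BACKUP_DIR=/infa/backups   # assumed backup directory
    STAMP=`date '+%Y%m%d'`

    pmrep connect -r "$REPO" -d "$DOMAIN" -n Administrator -x "$REP_PASSWD" || exit 1
    pmrep backup -o "$BACKUP_DIR/${REPO}_${STAMP}.rep"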
Confidential, Harrisburg, PA
ETL Developer
Environment: Informatica PowerCenter 7.1, Oracle 8i, SQL Server, DB2, MS Access 2000, SQL, PL/SQL, MS SQL Server 2000, Erwin, Windows NT.
Responsibilities:
- Mapped business requirements to source systems to decide which data elements would be extracted.
- Created logical and physical models of star and snowflake schemas in Erwin based on the business rules.
- Developed the logical model by iteratively refining it based on user feedback to match analysis and reporting requirements.
- Worked extensively in the development of Extract, Transform, and Load (ETL) routines.
- Coordinated between the onsite and offshore ETL teams as a lead offshore developer for meeting the scheduled project milestones and deadlines.
- Collected, normalized, and integrated source data from operational databases, as well as from third-party hospital, laboratory, and physician disease reports.
- Developed transformations and mapping guidelines for loading different targets.
- Used Informatica PowerCenter to create mappings and transform data.
- Created mappings using various active and passive transformations such as Source Qualifier, Lookup, Router, Stored Procedure, Aggregator, Filter, Joiner, and Expression, and created reusable mappings in Informatica.
- Documented all the mappings and the transformations involved in the ETL process.
- Unit and integration tested Informatica sessions, batches, workflows, and the target data.
- Implemented optimization and performance tuning of mappings to improve response times.
- Developed shell scripts to automate file manipulation and data loading procedures.
Confidential
Database Developer
Environment: Oracle 8i, Shell Scripts, UML, Test Director, SunOS
Responsibilities:
- Actively involved in gathering requirements from business users, converting them into system requirement specifications, and creating detailed use cases and design documents.
- Designed, developed, and managed the workflow processes to reflect business requirements with several adapters, exceptions and rules
- Was involved in data modeling. Designed data flows using UML.
- Designed and developed User Group Management modules to implement complex business rules for permissions.
- Coordinated the setup of the development, test, production, and contingency environments.
- Designed, developed, and managed the database star schema with various hierarchical and lookup tables.
- Developed and maintained complex stored procedures
- Involved in setting up a clustered application server environment.
- Underwent training in standard software processes for implementing CMM Level 5 in an enterprise organization.