Informatica Consultant Resume
Glen Mills, PA
SUMMARY
- Over eight years of experience in application development in the Financial, Mortgage, Insurance, Sales and Healthcare industries. Highly skilled in designing/redesigning, developing and implementing ETL processes, with complete development life cycle and data model knowledge of Data Warehousing. Expertise in the ETL tools Informatica and Ab Initio and in databases such as SQL Server, Oracle and Teradata. Strong experience in UNIX shell scripting and in job scheduling with Tidal and Tivoli.
- Proficiency in understanding Relational and Dimensional data models.
- Proficient in logical data modeling (Star, Snowflake schema) using ERwin and in denormalization techniques.
- Experienced in developing strategies for Extraction, Transformation and Loading (ETL) mechanism using Informatica and Ab Initio ETL tools.
- Experience in Integration of various data sources like Oracle, SQL Server, Flat Files, XML files, COBOL files.
- Maintain and enhance existing data extraction, cleansing, transformation, and loading processes to improve efficiency.
- Loaded facts and dimensions from source systems into target data marts.
- Involved in creating migration plans for moving applications from development through test and stage into the production environment.
- Experience creating and maintaining user accounts and permissions for different folders.
- Experience taking periodic backups and performing restores as required.
- Experience operating PowerCenter from the command line (pmcmd), including stopping, restarting and running jobs.
- Experience configuring Informatica Administration Console for Repository services, Integration services, Reporting services and Metadata Manager.
- Designed ETL standards for Enterprise Data Warehouse.
- Experience in UNIX shell scripting, automation of ETL processes using Maestro, Tivoli.
- Excellent communication, interpretation, client interaction and problem solving skills.
- Sincere, well-organized, quick-learning, self-motivated team player with experience in all phases of the systems life cycle. Strong ability to work independently and manage tasks based on business requirements.
TECHNICAL SKILLS
Operating Systems: UNIX, Linux 9.0, HP-UX 11.0, IBM AIX 5.3, Sun Solaris 9x, Windows XP/2000/NT.
Languages: C, SQL, PL/SQL, UNIX (Korn, Csh) Shell Scripting.
ETL Tools: Informatica PowerCenter 5.1/6.2/7.1/8.1/8.6, PowerExchange Change Data Capture (CDC), Ab Initio 1.11/1.12/1.13, Co>Operating System 2.11/2.12/2.13 and DataStage 7.0.
Reporting Tools: Data Analyzer 8.1/8.6, Business Objects 5.0 and Cognos 6.0.
Databases: Oracle 8i/9i/10g, TeradataV2R5/6, SQL Server 2000/2005 and DB2 8.1.
Database Tools: Toad 7.4/9.0, SQL Navigator 4.1 and SQL Assistant 7.1.
Testing Tools: Test Director 7.6 and Quality Center 8.2/9.2.
Scheduling Tools: TIDAL 5.3, Crontab, Tivoli 5.0/8.3, Maestro, Autosys and Control-M.
Version Control Tools: Harvest CM 5.1.1 and Visual SourceSafe 6.0.
Marketing Tools: Unica Affinium Campaign 6.2.6 and Unica Affinium Detect 6.8.7/7.0.
Data Modeling Tools: Erwin 4.1.
PROFESSIONAL EXPERIENCE
Confidential, Glen Mills, PA
Informatica Consultant
Responsibilities:
- Involved in Requirements Gathering, Analysis, Design, Development, Testing and Implementation for NUL and IMNC projects.
- Created mappings using different transformations like Expression, Sort, Aggregator, Lookup, Joiner, Router and Filter transformations to extract and integrate the data from SQL Server and Flat files.
- Reviewed Relational, Dimensional Model and Source to Target Mapping document.
- Developed Oracle PL/SQL Stored procedures
- Loaded facts and dimensions from the source into the target data mart.
- Worked with DBAs and Technology Architecture to optimize ETL performance
- Attended meetings to identify/clarify ETL project requirements
- Developed and implemented the error handling strategy for ETL.
- Created Unit test cases for ETL mappings to ensure functionality.
- Involved in managing domain, grids, nodes, services, folders, installing and applying patches, starting and stopping of Informatica server and services.
- Migrating data mappings to production and monitoring, troubleshooting and restarting batch processes using Informatica PowerCenter.
- Identify, communicate and resolve production issues, re-run/recovery Informatica workflows as required.
- Schedule backups of Informatica environment, restore Informatica environment when required.
- Maintain Informatica grid environment with multiple nodes (HA environment).
- Responsible for code migration from QA to Prod.
- Advised and assisted support teams in moving code from DEV to QA.
- Responsible for applying Informatica service packs and upgrades
- Maintain Informatica security (Users, groups, folder permissions)
- Reviewed code, ensured standards were followed and recommended performance improvements.
- Advised management on potential ETL capabilities that could optimize the working environment.
- Ensured adherence to the client's standards, policies and procedures and to Informatica best practices.
- Created UNIX shell scripts to add and remove users from the command line.
- Wrote scripts to list all users currently logged in to Informatica.
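The workflow re-run/recovery duties above were typically wrapped in a shell script around pmcmd; a minimal sketch, where the service, domain, user, folder and workflow names are hypothetical placeholders rather than actual client values:

```shell
#!/bin/sh
# Sketch of a pmcmd restart wrapper; INFA_SERVICE, INFA_DOMAIN, INFA_USER
# and the folder/workflow names below are hypothetical placeholders.
INFA_SERVICE="INT_SVC_DEV"
INFA_DOMAIN="DOM_DEV"
INFA_USER="etl_ops"

# Build the pmcmd command line for a folder/workflow pair so one wrapper
# can restart any failed workflow; the password is read from the INFA_PWD
# environment variable (-pv) instead of being hard-coded.
build_pmcmd_cmd() {
    folder="$1"
    workflow="$2"
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN" \
         "-u $INFA_USER -pv INFA_PWD -f $folder -wait $workflow"
}

# Example: the command that would restart a nightly load after a failure.
cmd=$(build_pmcmd_cmd "NUL_LOADS" "wf_nightly_load")
echo "$cmd"
```

Keeping command construction in a function makes the same wrapper reusable for any folder/workflow pair the scheduler needs to re-run.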
Environment: Informatica PowerCenter 8.1/8.6, PowerExchange Change Data Capture (CDC), Oracle 10g, SQL Server 2005, Erwin 4.1, AIX 5.3, Windows Vista, Business Objects, Cognos, Shell scripting, Crontab, TIDAL Scheduler, Toad, CVS and Visual SourceSafe 6.0.
Confidential, Wilmington, DE
Sr.Informatica ETL Developer
Responsibilities:
- Responsible to review Data models and mapping documents.
- Analysis, Profiling, cleansing and data extraction from various source systems involving Flat files, Oracle and SQL Server databases.
- Design and development of Informatica ETL process for staging loads, loading of dimensions, facts and their corresponding data marts.
- Used ODBC connection to extract data from SQL Server.
- Used Persistent Cache for reusable lookup transformation.
- Used Persistent variables to extract data from DSL (Staging) to ODW (Enterprise Data Warehouse).
- Used Conformed Dimensions.
- Created Materialized view for union of all products.
- Used Debugger to debug the Informatica mappings.
- Used Parameter file to load Balance tables.
- Created mappings for initial and incremental loads.
- Involved in Data Profiling/Data analysis.
- Used Mapping Parameters and Variables for ETL process.
- Involved in code reviews and code migrations.
- Worked on Dimensions for SCD1 and SCD2 columns using Dynamic lookup.
- Scheduled jobs in Tidal scheduler to run Daily/Weekly/Monthly.
- Created a UNIX wrapper script to load historical data and go-forward data for the deposit product summary.
- Used Informatica Versioning for version control for repository objects.
- Involved in migrating the repository objects between development, testing (QA), and production systems with necessary approvals.
- Upgraded Informatica 8.1.1 to Informatica 8.6.0.
- Configured Data Profiling and Reporting Service in Informatica PowerCenter.
- Implemented Deployment Groups for code migrations.
- Configured Informatica Administration Console for Repository services, Integration services, Reporting services and Metadata Manager.
- Implemented Test Scenario methodology for Quality Assurance.
- Developed Oracle PL/SQL Stored procedures for Product Interest rate.
- Production support involving enhancement of the legacy data warehouse (OCB), Informatica ETL, Oracle stored procedures and Windows batch scripts.
- Developed UNIX shell scripts to cleanse source files, append dates and archive files.
- Used Visual SourceSafe to maintain version control of project documents.
- Created Migration document for the code migrations.
- Involved in Unit, System, Integration and UAT testing.
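The history/go-forward wrapper described above amounts to a mode switch over Informatica parameter files; a minimal sketch, in which the directory, parameter-file names and workflow name are all hypothetical:

```shell
#!/bin/sh
# Sketch of the deposit-product-summary load wrapper; the parameter-file
# directory, file names and workflow name are hypothetical placeholders.
PARAM_DIR="/tmp/infa_params"

# Choose the parameter file for the requested run mode: HISTORY for the
# one-off backfill, DAILY for the regular go-forward load.
select_param_file() {
    case "$1" in
        HISTORY) echo "$PARAM_DIR/wf_dps_history.par" ;;
        DAILY)   echo "$PARAM_DIR/wf_dps_daily.par" ;;
        *)       echo "unknown mode: $1" >&2; return 1 ;;
    esac
}

# Default to the daily go-forward run when no mode is passed in.
pf=$(select_param_file "${1:-DAILY}")
echo "would run: pmcmd startworkflow ... -paramfile $pf wf_deposit_product_summary"
```

The same workflow then serves both the backfill and the ongoing load, differing only in the `-paramfile` it receives.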
Environment: Informatica PowerCenter 8.1/8.6, PowerExchange Change Data Capture (CDC), Oracle 10g, SQL Server 2005, Erwin 4.1, AIX 5.3, Windows XP, Cognos, Shell scripting, TIDAL Scheduler, Toad, PL/SQL Developer and Visual SourceSafe 6.0.
Confidential, Baltimore, MD
Sr.Informatica ETL Developer
Responsibilities:
- Involved in Analysis, Design, Development, Testing and Implementation for IDR Finder File and PDE projects.
- Created Technical Design deck for Finder File and PDE work.
- Loaded Dimensions and Fact tables into IDR data mart.
- Loaded Security Signature tables.
- Created mappings using different transformations like Expression, Sort, Aggregator, Lookup, Joiner, Router and Filter transformations to extract and integrate the data from Teradata and Flat files.
- Used CMS ETL Standards for transformations, sessions and workflows.
- Used the Teradata FastLoad and MultiLoad (MLoad) external loaders in Informatica.
- Created Sessions and Workflows.
- Used Pushdown Optimization to increase performance.
- Created generic Shell Script for File Watcher for Beneficiary, Contract, Party and Prod NDC Finder Files.
- Used Parameter file.
- Created generic Shell Script for Sending Success/Failure email report with User Name, Finder File Name, Email Id, Rows Received, Sequence Number, Rows loaded and Rows Rejected.
- Created Teradata BTEQ scripts to delete rows whose Finder File End Date was earlier than the processing date.
- Created Test Cases, Test Data, Test Scenarios and tested successfully.
- Involved in Unit, System, Integration and UAT testing.
- Informatica Admin Activities.
- Involved in migrating the repository objects between development, testing, integration and production systems with necessary Manager’s approvals.
- Involved in creation of projects on the UNIX ETL servers.
- Involved in Repository Administration - creation of Folders, Users, User Groups, and giving privileges.
- Involved in sending Informatica downtime notification emails to Informatica users.
- Involved in Informatica 8.1.1 installation and configuration on Linux (Z9).
- Installed and configured Data Profiling in Informatica PowerCenter.
- Documented the installed steps for the future reference.
- Involved in performance tuning of ETL transforms and workflows.
- Involved in Informatica 8.6.0 installation and configuration on Solaris 10.
- Verified applications with CMS ETL Standards before Folder Migrations.
- Generated the reports using the Data Analyzer.
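The generic File Watcher above is, at heart, a polling loop with a timeout; a minimal sketch, with illustrative paths and polling intervals rather than actual CMS values:

```shell
#!/bin/sh
# Sketch of a generic file watcher for landing-zone feeds such as the
# Beneficiary, Contract, Party and Prod NDC Finder Files. Paths and
# timings are illustrative placeholders.

# Poll for a landing file: succeed as soon as it appears, fail after
# max_tries polls so the scheduler can raise an alert.
wait_for_file() {
    file="$1"; max_tries="$2"; sleep_secs="$3"
    tries=0
    while [ "$tries" -lt "$max_tries" ]; do
        if [ -f "$file" ]; then
            echo "found $file"
            return 0
        fi
        tries=$((tries + 1))
        sleep "$sleep_secs"
    done
    echo "timed out waiting for $file" >&2
    return 1
}

# Example: wait up to 3 polls of 1 second each for a beneficiary file.
touch /tmp/beneficiary_finder.dat
wait_for_file /tmp/beneficiary_finder.dat 3 1
```

Returning a non-zero exit code on timeout lets the calling job (Tivoli, cron, etc.) distinguish "file never arrived" from a normal run.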
Environment: Informatica PowerCenter 8.1/8.6, PowerExchange Change Data Capture (CDC), Data Analyzer 8.1/8.6, Oracle 10g, Teradata V2R6, TTU 8.2, DB2 8.1, Erwin 4.1, Sun Solaris 9, Linux, Windows XP, Microstrategy 8.1, Cognos, Shell scripting, Crontab, Tivoli, Toad and Teradata SQL Assistant 7.1.
Confidential, Wilmington, DE
ETL Developer
Responsibilities:
- Understanding the Conceptual design.
- Participated in the development cycle including analysis, design, build and test.
- Participated in the full testing cycle, including bug fixes (Assembly, System and Production).
- Involved in developing, implementing, and continuously updating best practices and standards as they relate to ETL in the client environment.
- Converting business requirements into technical specifications, and detailed mapping documents.
- Used Ab Initio/Informatica as the ETL tools to pull data from source systems and to cleanse, transform and load the data into the database (Customer Data Mart).
- Implemented Change Data Capture extensively to send only the delta records to the target systems.
- Loaded facts and dimensions from the sources (ADS, EDW) into the target Customer Data Mart.
- Optimized/Tuned Ab Initio graphs for better performance and efficiency.
- Tuned performance of SQL Queries, Inputs/Sources, Outputs/Targets and Unix wrapper.
- Developed various UNIX scripts, including scripts that:
- Archive the files and purge them after appropriate retention period.
- FTP the files and implement the Hand Shake Mechanism.
- Create the dynamic File List that is being processed during the ETL execution.
- Merge the Header and Line items from different files and create new files based on common field - Document ID.
- Execute SQL statements based on the requirements.
- Created PL/SQL procedures and queries per the requirements and to resolve data issues.
- Involved in creating test plans and migration plans for moving the application from development through test and stage into the production environment.
- Worked with the testing team to validate the business logic and move code into production.
- Set up batch processing of shell scripts per the schedules defined by the business.
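The dynamic file-list step described above can be sketched as a small shell function; the landing directory, feed pattern and list-file name are illustrative placeholders:

```shell
#!/bin/sh
# Sketch of the dynamic file-list step: collect the landing files that
# match a feed pattern into a list file consumed by the ETL run.
# Directory, pattern and list-file names are illustrative.
build_file_list() {
    landing_dir="$1"; pattern="$2"; list_file="$3"
    : > "$list_file"                 # truncate the previous run's list
    for f in "$landing_dir"/$pattern; do
        # Skip the unexpanded glob when no files have landed yet.
        [ -f "$f" ] && echo "$f" >> "$list_file"
    done
    wc -l < "$list_file"             # report how many files were found
}

# Example: list the order-header files waiting in the landing zone.
mkdir -p /tmp/landing
touch /tmp/landing/ord_hdr_001.dat /tmp/landing/ord_hdr_002.dat
build_file_list /tmp/landing "ord_hdr_*.dat" /tmp/landing/file_list.txt
```

Regenerating the list on each run keeps the ETL driven by whatever files actually arrived, rather than a hard-coded set.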
Environment: Ab Initio 1.13, Informatica 8.1, Oracle 9i/10g, DB2, Flat Files, UNIX (IBM AIX 5.0), Shell Scripting, Quality Center 9.2, Test Director 7.6, Tivoli Scheduler 5.0/8.3, Harvest CM 5.1, Affinium Campaign 6.2.6, Affinium Detect 6.8.7/7.0 and SQL Navigator 4.1 and Windows 2000.
Confidential, Austin, TX
Data Warehouse, Ab Initio Developer
Responsibilities:
- Analyzing the Data Model.
- Understanding and reading Business rules from mapping document.
- Used Ab Initio as the ETL tool to pull data from source systems and to cleanse, transform and load the data into databases (Confidential Data Warehouse).
- Replicated operational tables into staging tables, transformed and loaded data into warehouse tables using Ab Initio GDE, and was responsible for automating the ETL process through scheduling and exception-handling routines.
- Implemented data parallelism in graphs: data is divided into segments by the Ab Initio partition components and each segment is operated on simultaneously.
- Implemented an 8-way multifile system composed of individual files on different nodes, partitioned and stored in distributed directories (using multidirectories).
- Responsible for configuring the database configuration file to connect to the ODS and DSS to load the data.
- Used Ab Initio components like Input File, Output File, Input Table, Output Table, Lookup, Reformat, Uncompress, Replicate, Filter by Expression, Join, Gather, Sort, Dedup Sorted and Run SQL.
- Used functions in Teradata like CAST, CASE, COALESCE, NULLIF, EXTRACT etc.
- Created Teradata macros.
- Created and Tested Ab Initio graphs.
- Experience with Toad and VSS (Visual SourceSafe).
- Configured Ab Initio graphs in wrapper script.
- Scheduling the graphs/wrapper scripts in the Control-M.
- Involved in data validation
- Development of Test cases/Test plan/Test reports for Unit Testing/System Testing.
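Wrapping each deployed graph in a shell script for Control-M, as above, follows a common pattern: run the graph, capture its log, and surface the exit code. A minimal sketch, with an illustrative log directory and a dummy script standing in for a real deployed .ksh graph:

```shell
#!/bin/sh
# Sketch of a Control-M wrapper around a deployed Ab Initio graph.
# The log directory and graph script names are illustrative placeholders.
LOG_DIR="/tmp/ab_logs"
mkdir -p "$LOG_DIR"

# Run a deployed graph script, capture its output to a log, and pass the
# exit code back so Control-M can mark the job failed on a non-zero rc.
run_graph() {
    graph_script="$1"
    log_file="$LOG_DIR/$(basename "$graph_script").log"
    sh "$graph_script" > "$log_file" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "graph $graph_script failed with rc=$rc" >> "$log_file"
    fi
    return "$rc"
}

# Example with a stand-in "graph" that simply succeeds.
echo 'exit 0' > /tmp/fake_graph.ksh
run_graph /tmp/fake_graph.ksh && echo "graph OK"
```

Propagating the graph's exit code unchanged is what lets the scheduler distinguish a clean run from a failure without parsing logs.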
Environment: Ab Initio 1.11.1, Teradata V2R5, Teradata Utilities (Queryman, BTEQ, Fast Load), Korn Shell Scripting, Control-M Scheduler, Oracle 8i/9i, Co>Operating System 2.11.1, D3, Red Hat Linux Advanced Server release 2.1AS/i686 and Windows XP.
Confidential, Atlanta, GA
Informatica Developer
Responsibilities:
- Involved in the Analysis, Design and Development of Data warehousing solutions and in developing strategies for Extraction, Transformation and Loading (ETL).
- Used Informatica as an ETL tool to extract data from source systems (OLTP) to the target system (Data Warehouse/Data Mart).
- Extracted information from various sources, which included flat files and RDBMS and applied specific business logics to the source data.
- Migrated data from SQL Server (Space Application) to Oracle (Rocket Application)
- Identification of heterogeneous data sources (i.e. SQL Server, MS Access, Flat files and Oracle).
- Created Source to Target Matrix document that also lists the transformation rules
- Used the Normalizer transformation to convert column values into rows.
- Created around 60 mappings from 30 source (SQL Server) tables to 50 target (Oracle) tables.
- Provided ETL repository administration services and enforced ETL object reusability.
- Facilitated ETL-related data quality management.
- Worked with DBA and Technology Architecture to optimize ETL performance
- Attended meetings to identify/clarify ETL project requirements
- Developed and implemented the error handling strategy for ETL.
- Used staging tables to implement business rules.
- Created Post SQL to delete rows from staging tables.
- Loaded facts and dimensions from the source into the target data mart.
- Unit tested ETL mappings to ensure functionality.
- Validated the transformation logic in the ETL mappings according to the pre-defined ETL standards.
- Experience in PL/SQL Programming (Stored Procedures, Functions and Triggers).
- Tasked with performance tuning of Informatica Environment.
- Employed data transformations including Slowly Changing Dimensions Type II Date Range.
- Created various tasks such as Assignment, E-mail, Control and Decision.
- Created mappings using different transformations like Expression, Aggregator, Lookup, Router, Update strategy, Filter and Stored Procedure transformations to extract and integrate the data from different databases and files. Also created reusable transformations like Lookup, Expression and Stored-Procedures.
- Created and Modified Reusable Transformations and Mapplets.
- Experienced in Repository Administration, Backups, creation of User Groups, and Tuning Informatica mappings/sessions.
- Created worklet/workflows using tasks like decision, Timer, Event raise, event wait and command to meet the user requirements.
- Development of Test cases/Test plan/Test reports for Unit Testing/System Testing.
- Developed and scheduled Workflows using task developer, worklet designer, and workflow designer in Workflow manager and monitored the results in Workflow monitor.
- Used PMCMD commands of Informatica in UNIX scripts to schedule sessions and jobs.
- Involved in migrating the repository objects between development, testing and production systems with necessary Manager’s approvals.
- Involved in Knowledge transfer to the users to access and run the jobs Using Informatica Workflow Manager.
- Used Maestro to schedule the workflows using shell scripts to run automatically.
- Involved in production support and documented any kind of issues encountered.
Environment: Informatica PowerCenter 5.1/6.2, Brio, Oracle 9i, SQL Server 2000, Erwin 4.1, HP-UX, Windows 2000, Shell scripting and MAESTRO Scheduler.
Confidential, Boston, MA
Informatica Developer
Responsibilities:
- Analyzed the business and functional requirements and translated them into technical specifications and data rules required for the ETL process.
- Imported various application sources, and created targets and transformations using Informatica Designer (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer).
- Responsible for collecting the data source information from all the legacy systems and existing data stores.
- Involved in Data Extraction, Transformation and Loading from source systems (OLTP) to the ODS.
- Developed complex mappings using multiple sources and targets in different databases, flat and XML files.
- Used various transformations like Unconnected /Connected Lookup, Aggregator, Expression, Joiner, Custom Sequence Generator, Router etc.
- Scheduled Sessions/Batches to load the data into Confidential Data Mart.
- Used PMCMD commands of Informatica in UNIX scripts to schedule sessions and jobs.
- Involved in migrating the repository objects between development, testing and production systems with necessary Manager’s approvals.
- Involved in Knowledge transfer to the users to access and run the jobs Using Informatica Server Manager.
- Involved in production support and troubleshooting data quality and integrity issues
- Performed Debugging and performance tuning of Mappings and error handling
- Universe design.
- Generating reports using various Business Objects user module.
- Responsible for designing the universe: created the Business Objects data model by selecting and joining tables, indicating cardinalities, creating aliases to resolve loops, subdividing into contexts and creating objects grouped into classes.
- Generated reports for web users using Web Intelligence 2.6 InfoView.
- Development of Test cases/Test plan/Test reports for Unit Testing/System Testing.
Environment: Informatica PowerCenter 5.1, Business Objects 5.1, WebI 2.6, Erwin 4.1, Oracle 8i, SQL Server 2000, Citrix 6.3, Solaris and Windows 2000.