Sr. ETL Informatica Consultant Resume Profile
Maryland
SUMMARY
- Certified Informatica developer with over 9 years of IT industry experience in full life-cycle software development (concept, requirements, design, development, testing and deployment) on projects using data warehousing and client/server technologies.
- Expertise in ETL development, unit testing, scheduling and production support using data warehousing tools such as Informatica Power Center 9.1/8.6/8.5/8.1/7.1/6.2, Power Mart 9.1/8.6/8.5/8.1/7.1/6.2, IDQ, OWB, Oracle 11g/10g/9i and Erwin.
- Solid hands-on experience with Data warehousing tools, Data Analysis, ETL Development, Data Mapping, Unit Testing, Migration, Conversions, and Process Documentation.
- Proven ability to install, administer, configure, extract, transform and load data using Informatica.
- Extensive experience in debugging, unit testing, system integration testing and data quality.
- Knowledge of dimensional modeling: involved in creating Star Schema and Snowflake dimensional data marts, performance tuning of sources, mappings, targets and sessions, building mapplets/reusable transformations, configuring the Informatica Server and scheduling sessions, creating fact and dimension tables, and physical and logical data modeling.
- Worked extensively on Slowly Changing Dimension (SCD) methodologies, error handling and performance tuning.
- Wide experience on migration projects involving the migration of mappings from various tools to Informatica. Experience in process modeling and dimensional modeling for business requirements and technical designs.
- Experience working with Agile/Scrum methodologies.
- Good understanding of MDM concepts.
- Functional experience in Media (Arbitron), Healthcare/Insurance (CareFirst BCBS) and K-12 Education Systems (MSDE).
- Good experience in mentoring the team members and providing valuable suggestions in critical issues.
- Good communication, interpersonal and analytical skills.
- Functional experience in the K-12 Education, Pharmaceutical, Insurance and Communications domains.
TECHNICAL SKILLS:
ETL Tools (9 years) | Informatica Power Center 9.1/8.x/7.1/6.2, Power Mart 9/8.x/7.1/6.2, IDQ, OWB, Oracle Data Profiler
Databases (9 years) | Oracle 11g/10g/9i, MS SQL Server 7.0/8.0
Languages (9 years) | SQL, PL/SQL, UNIX Shell scripting, C, C
Packages (9 years) | MS Office
Design Tools (5 years) | Erwin, Visio
Reporting Tools | Business Objects 6
Environment | UNIX, Windows 98/2000/NT/XP, 2007, 2008
PROFESSIONAL EXPERIENCE
Confidential
Role:
Confidential Department of Education's Maryland Longitudinal Data System project, which involves the development of Enterprise Data Marts based on various Student and Course business logic. The EDW (Enterprise Data Warehouse) is a dimensional model holding multidimensional data from various sources such as flat files, Oracle and SAS. This data serves as the source and is moved to separate data marts based on new/modified business logic. Reports are extracted as flat files and through Oracle Business Intelligence for verification/analysis by managers. The reports help produce detailed activity analysis and resource plans, understand cause-and-effect relationships, organize strategic thinking and performance measurement, use continuous, collaborative forecasting to manage the plan in real time, and clearly communicate strategy and success measures.
Confidential
- Career Technology Education (CTE) covers the collection of Maryland students who took CTE courses and those with completer and/or concentrator status.
- Student Clearing House deals with collecting graduate student information from MSDE and sending it to the National Clearing House, which tracks students who went on to colleges within Maryland.
- Involved in the full SDLC process of the project, which consists of requirements, design, implementation and support.
Confidential
Early Childhood project involves multiple phases. The MMSR phase involves collecting pre-kindergarten students' information across years, pushing it into ODS tables after extensive cleansing of SASID and checks for uniqueness of data, and then building dashboard reports for analysis. The second phase involves housing Provider, Accreditation and Facility data using Type 2 (SCD) methodologies, with further analysis via dashboards; a minimal Type 2 load sketch is shown below. Involved in the full SDLC process of the project, which consists of requirements, design, implementation and support.
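The Type 2 loads mentioned above can be illustrated in plain Oracle SQL. The sketch below is a simplified, hypothetical version of such a load; the table, column and sequence names (stg_provider, dim_provider, dim_provider_seq) are illustrative only and not the actual MSDE objects. In the project itself this logic was implemented in Informatica mappings with Lookup and Update Strategy transformations.

```sql
-- Hypothetical SCD Type 2 sketch (Oracle SQL). Object names are illustrative.

-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_provider d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (
         SELECT 1
         FROM   stg_provider s
         WHERE  s.provider_id    = d.provider_id
         AND    s.provider_name <> d.provider_name);   -- change detected

-- Step 2: insert a new current version for new and changed providers.
INSERT INTO dim_provider
       (provider_key, provider_id, provider_name,
        eff_start_dt, eff_end_dt, current_flag)
SELECT dim_provider_seq.NEXTVAL,        -- surrogate key
       s.provider_id,
       s.provider_name,
       TRUNC(SYSDATE),
       DATE '9999-12-31',
       'Y'
FROM   stg_provider s
WHERE  NOT EXISTS (
         SELECT 1
         FROM   dim_provider d
         WHERE  d.provider_id   = s.provider_id
         AND    d.current_flag  = 'Y'
         AND    d.provider_name = s.provider_name);

COMMIT;
```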
Confidential
- P61/StateStat is a collection from the Maryland Governor's Data Unit (GDU). This collection includes data from various agencies such as the Department of Public Safety, the Department of Police and Patuxent.
- This monthly data is collected in various spreadsheets for the various measures of the individual departments.
- This data collection has to be ported into a database as a star model, from which reports are extracted for data analysis. Involved in requirement gathering, analysis, design, implementation and testing, and in the full SDLC process of the project (requirements, design, implementation and support).
Responsibilities:
- The data is collected as various spreadsheets from the CTE team and then staged after general cleansing.
- This data is then pushed into the ODS and reporting tables, from which reports are generated for dashboard analysis by the CTE and other affiliated teams.
- Developed various complex ETL code using Informatica, Oracle Warehouse Builder, PL/SQL and UNIX scripts.
- Created logical and physical data structures in Erwin and executed DDL scripts to generate tables.
- Gathered requirements from clients for collecting the KPI requirements.
- Worked for the Governor's Data Unit (GDU).
- This data is then ported into a set of ODS and RPT tables which are used for dashboard analysis.
- Involved in the full SDLC process of the project, which consists of requirements, design, implementation and support.
- Worked extensively with Informatica Source Qualifier, Workflow Manager, Monitor, debugger, indicator files, mapping parameters and variables.
- Extracted data from various sources such as Oracle and flat files, and developed Informatica/OWB mappings to validate the various flat file formats and catch column errors.
- Used incremental delta loading and Change Data Capture (CDC) techniques in loading data, as sketched at the end of this section.
- Executed extensive Unit Testing using SQL Scripts.
- Worked on R code for processing files to calculate growth percentile values.
- Created technical documentation for validation rules, error handling and test strategies.
- Mentored team members on best practices.
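A minimal sketch of the incremental/CDC loading pattern referenced above, expressed in plain Oracle SQL. The names (etl_control, src_course_enrollment, stg_course_enrollment) are hypothetical; in Informatica the same watermark is usually carried in a persisted mapping variable (e.g. $$LastExtractDate) rather than a control table.

```sql
-- Hypothetical incremental / CDC extract sketch (Oracle SQL).
-- Only rows changed since the last successful run are pulled forward.
INSERT INTO stg_course_enrollment (student_id, course_id, status, last_upd_dt)
SELECT s.student_id, s.course_id, s.status, s.last_upd_dt
FROM   src_course_enrollment s
WHERE  s.last_upd_dt > (SELECT c.last_extract_dt
                        FROM   etl_control c
                        WHERE  c.interface_name = 'COURSE_ENROLLMENT');

-- Advance the watermark only after the staging load succeeds.
UPDATE etl_control c
SET    c.last_extract_dt = (SELECT MAX(s.last_upd_dt)
                            FROM   stg_course_enrollment s)
WHERE  c.interface_name = 'COURSE_ENROLLMENT';

COMMIT;
```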
Environment: Informatica Power Center/Power Mart 9.1/8.6, Oracle Warehouse Builder, Oracle Data Profiler, Oracle 11g, SQL, PL/SQL, R, Shell Scripting, HP-UX, Erwin
Confidential
Role: Lead Informatica Consultant
Confidential Telecommunications warehouse data model deals with analyzing the performance of individual markets in various countries and providing a uniform approach to gathering metrics within each market.
Responsibilities:
- Gathered requirements from end users and OBIEE/Microstrategy developers for collecting the KPI requirements.
- Worked through logical and physical data structures created in Erwin and executed DDL scripts to generate tables.
- Developed various simple and complex ETL code using Informatica, PL/SQL and UNIX scripts.
- Utilized the Informatica debugger, indicator files, mapping parameters and variables.
- Extracted data from various sources such as Oracle, XML and flat files, and developed Informatica mappings to validate the various flat file formats and catch column errors.
- Used incremental delta loading and Change Data Capture (CDC) techniques in loading data.
- Performed unit testing using SQL scripts; a sample reconciliation query is sketched below.
- Created technical documentation for validation rules, error handling and test strategies.
- Mentored team members on best practices.
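The unit-test SQL scripts mentioned above typically boil down to a few reconciliation queries. The sketch below assumes hypothetical table names (stg_market_metric, fact_market_metric, dim_market); the real checks were written against the project's own staging and mart tables.

```sql
-- Hypothetical unit-test queries (Oracle SQL); table names are illustrative.

-- 1. Row-count reconciliation between staging and the fact table for one load date.
SELECT (SELECT COUNT(*) FROM stg_market_metric  WHERE load_dt = TRUNC(SYSDATE)) AS src_cnt,
       (SELECT COUNT(*) FROM fact_market_metric WHERE load_dt = TRUNC(SYSDATE)) AS tgt_cnt
FROM   dual;

-- 2. Orphan check: every fact row must resolve to a market dimension row.
SELECT f.market_key, COUNT(*) AS orphan_rows
FROM   fact_market_metric f
LEFT   JOIN dim_market d ON d.market_key = f.market_key
WHERE  d.market_key IS NULL
GROUP  BY f.market_key;

-- 3. Duplicate check on the natural key.
SELECT market_cd, metric_dt, COUNT(*) AS dup_cnt
FROM   fact_market_metric
GROUP  BY market_cd, metric_dt
HAVING COUNT(*) > 1;
```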
Environment: Informatica Power Center/Power Mart 9/8.6, Oracle 11g, SQL, PL/SQL, JMS/JNDI, XML, Shell Scripting, HP-UX, Erwin
Confidential
Role: Sr. Informatica Consultant
Confidential Enterprise Data Warehouse project involves the development of an Enterprise Data Mart based on various insurance business logic. The EDW (Enterprise Data Warehouse) is a dimensional model holding multidimensional data from various sources such as Facets, Oracle and mainframe. This data serves as the source and is moved to separate data marts based on new/modified business logic. The reports are extracted as flat files and sent to Verisk for verification/analysis by managers. The reports help produce detailed activity analysis and resource plans, understand the cause-and-effect relationship between cost and behavior, organize strategic thinking and performance measurement, use continuous, collaborative forecasting to manage the plan in real time, and clearly communicate strategy and success measures. Worked on the Healthy Blue product, which sources Healthy Blue member information from member tables and sends it to Health Fitness.
Responsibilities:
- Analyzed the business process and created the design and technical document from the functional specifications.
- Developed Staging, Dimensions and Facts using Informatica.
- Extensively used various performance tuning techniques to improve job performance and SQL query performance.
- Loaded data from Oracle, flat files and MS SQL Server databases.
- Worked with Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Created and implemented Source Qualifier, Expression, Filter, connected and unconnected Lookup, Normalizer, Aggregator and Update Strategy transformations, shared mapplets and reusable transformations to load data from source to target using Designer.
- Used PL/SQL to create procedures and database triggers; a minimal procedure sketch is shown at the end of this section.
- Worked in an Agile/Scrum methodology, providing daily status updates to managers and identifying outstanding impediments.
- Created sessions with indicator files, mapping parameters and variables.
- Performed unit testing, which validates that data is mapped correctly, and system integration testing using SQL scripts, which provides a qualitative check that the overall data flow lands correctly in the target tables.
- Customized data by adding calculations, summaries and functions.
- Created UNIX Shell Script for creating the extracts.
- Created Informatica maps to validate the various flat file formats and catch column errors.
- Used debugger for error handling and to test the data flow.
- Worked on creation of JMS and JNDI connection on Informatica servers.
- Created and sent XML messages to JMS Queue through Tibco.
- Used Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Assisted in mentoring other team members on best practices and skills. Created technical documentation for validation rules, error handling and test strategies.
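A minimal sketch of the kind of PL/SQL procedure referenced above. The objects (member_stg, member_reject) and the rejection rule are hypothetical stand-ins, not the actual CareFirst tables or business rules.

```sql
-- Hypothetical PL/SQL procedure sketch; names and rules are illustrative only.
CREATE OR REPLACE PROCEDURE purge_rejected_members (p_batch_id IN NUMBER) AS
BEGIN
    -- Move members failing a basic data-quality rule into a reject table
    -- so the downstream extract only sees clean rows.
    INSERT INTO member_reject (batch_id, member_id, reject_reason)
    SELECT p_batch_id, m.member_id, 'MISSING DATE OF BIRTH'
    FROM   member_stg m
    WHERE  m.batch_id = p_batch_id
    AND    m.date_of_birth IS NULL;

    DELETE FROM member_stg m
    WHERE  m.batch_id = p_batch_id
    AND    m.date_of_birth IS NULL;

    COMMIT;
END purge_rejected_members;
/
```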
Environment: Informatica Power Center/Power Mart 8.5, Oracle 11g/10g/9i, SQL, PL/SQL, JMS/JNDI, XML, Shell Scripting, HP-UX
Confidential is a positive approach to health care that fosters and rewards healthy lifestyles and promotes a strong, trusting relationship among patients. The project involves sourcing Facets members from the EDW and loading them into the EDM staging table. The eligible members are then sent to the CIAM Fitness team through a JMS queue, and the data is also FTP'd to Tumbleweed and to internal drives.
Responsibilities:
- Analyzed the business process and created the design document from the functional specifications.
- Developed complex mappings with respective transformations to satisfy the complex business logics using Designer.
- Worked with Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Created and implemented Source Qualifier, Expression, Filter, connected and unconnected Lookup, Normalizer, Aggregator, SQL Transformation and Update Strategy transformations, shared mapplets and reusable transformations to load data from source to target using Designer.
- Loaded data from Oracle into JMS, XML and flat files.
Environment: Informatica Power Center/Power Mart 8.5, Oracle 11g/10g/9i, SQL, PL/SQL, JMS/JNDI, XML, Shell Scripting, HP-UX
Programmer Analyst
Confidential is an international media and marketing research firm primarily serving radio, cable, advertising agencies, advertisers, outdoor and out-of-home media and, through its Scarborough joint venture, broadcast television and print media. The Company has four main services: measuring radio audiences in local markets in the United States and Mexico; measuring national radio audiences and the audience size of network radio programs and commercials; providing application software used for accessing and analyzing media audience and marketing information data; and providing consumer and media usage information services to radio, cable, advertising agencies, advertisers, outdoor and out-of-home media, Internet broadcasters and, through its Scarborough joint venture, broadcast television and print media.
Responsibilities:
- Analyzed the business process and created the design document from the functional specifications.
- Arranged dimensions and measures in a Star model as per requirements. Performed physical and logical data modeling using Erwin.
- Installed and configured Informatica on UNIX and Windows XP/2000/NT platforms.
- Developed Staging, Dimensions and Facts using Informatica.
- Extensively used various performance tuning techniques to improve job performance and SQL query performance.
- Developed complex mappings with respective transformations to satisfy the complex business logics using Designer.
- Loaded data from Oracle and flat file sources.
- Developed Staging, Dimensions and Facts for Arbitron using Informatica.
- Worked with Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Created and implemented Source Qualifier, Expression, Filter, connected and unconnected Lookup, Normalizer, Aggregator, SQL Transformation, Rank and Update Strategy transformations, shared mapplets and reusable transformations to load data from source to target using Designer.
- Used PL/SQL to create procedures.
- Worked in an Agile/Scrum methodology, providing daily status updates to managers and identifying outstanding impediments.
- Worked closely with client management to identify and specify complex business requirements and processes.
- Created sessions with indicator files, mapping parameters and variables.
- Performed unit testing, which validates that data is mapped correctly, and system integration testing using SQL scripts, which provides a qualitative check that the overall data flow lands correctly in the target tables.
- Customized data by adding calculations, summaries and functions.
- Created UNIX Shell Script for file transfers.
- Created Shell scripts to create a File List which is used in Informatica as a Source.
- Used debugger extensively for error handling and to test the data flow.
- Used incremental delta loading and Change Data Capture (CDC) techniques to load data.
- Used Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Assisted in mentoring other team members on best practices and skills. Created technical documentation for validation rules, error handling and test strategies.
- Arranged dimensions and measures as a Star model as per requirement.
- Analyzed the existing Business process and created the Designs using Informatica.
- Involved in the migration of mappings from BODI to Informatica.
Confidential
Confidential focuses on new ways to manage health. Their products span the continuum of care, from nutritional products and laboratory diagnostics through medical devices and pharmaceutical therapies. The comprehensive line of products encircles life itself, addressing important health needs from infancy to the golden years. Abbott has sales, manufacturing, research and development, and distribution facilities around the world.
Responsibilities:
- Used the debugger for error handling and to test the data flow, and created technical documentation.
- Generated reports to give better analysis to the sales and marketing division. The data is extracted from various databases such as Oracle and loaded into the target database (data warehouse tables) using Informatica.
- Scheduled the Jobs in Informatica to run on various Business days.
- Installed and configured Informatica on Windows 2000 Server/NT/XP Professional.
- Analyzed the business process and created the design document from the functional specifications.
- Arranged dimensions and measures as a Star model as per requirement.
- Created Physical and Logical Data modeling using Erwin.
- Installed and configured Informatica on Windows 2000 Server/NT/XP Professional.
- Developed Staging, Dimensions and Facts for Abbott using Informatica.
- Extensively used various performance tuning techniques to improve job and SQL query performance.
- Developed complex mappings with transformations to satisfy complex business logic using Designer.
- Loaded data from Oracle and flat files.
- Worked with Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Created and implemented Source Qualifier, Expression, Filter, connected and unconnected Lookup, Normalizer, Aggregator, SQL Transformation and Update Strategy transformations, shared mapplets and reusable transformations to load data from source to target using Designer.
- Created sessions with indicator files, mapping parameters and variables.
- Performed unit testing, which validates that data is mapped correctly, and system integration testing using SQL scripts, which provides a qualitative check that the overall data flow lands correctly in the target tables.
- Customized data by adding calculations, summaries and functions.
- Created Shell Script for validation checking of the Source Data Files.
- Created Informatica maps to validate the various flat file formats and catch column errors.
- Used debugger for error handling and to test the data flow.
- Used Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Assisted in mentoring other team members on best practices and skills.
- Created technical documentation for validation rules, error handling and test strategies.
Environment: Informatica 8.5, Business Objects Data Integrator (BODI), Oracle 9i, SQL, PL/SQL, UNIX
Confidential
Software Engineer
Confidential has various products and wants to get the status and details of all campaigns under different statuses such as planned, proposed and in flight. The customer uses a tool called Aprimo to enroll in each campaign and to go through different functionalities such as approvals and channels. The Aprimo team tracks this data using software called Peta, from which the data is loaded into Oracle tables. This data is moved into warehouse tables after applying cleansing and business logic using the ETL tool Informatica. Using this data, we build a universe and generate reports.
Responsibilities:
- Studied the business process and involved in requirement gathering.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using Erwin.
- Used Star schema and Snowflake schema in relational, dimensional and multidimensional modeling; a minimal star-schema DDL sketch follows this section.
- Installed and configured Informatica 8/7.1 on Windows 2000 Server/NT/XP Professional.
- Extensively used various performance tuning techniques to improve map performance.
- Created source definitions from flat files, imported target tables from the respective databases, and created reusable transformations and mapplets in a shared folder.
- Developed the Mappings with Mapplets/ Reusable transformations to adhere to the business logic, using Designer.
- Created Expression, Lookup, Filter, Router, Normalizer, and Aggregator Transformations to load the data from source to target using Designer.
- Developed and scheduled the maps on Informatica Server which would initialize the workflows automatically.
- Performed unit testing, which validates that data is mapped correctly, and system testing, which provides a qualitative check that the overall data flow lands correctly in the target tables.
- Generated advanced reports to give better analysis to the sales and marketing division. The data was extracted from flat files as well as from various databases such as Oracle and loaded into the target database (data warehouse tables) using Informatica Power Mart/Power Center 7.1.
- Used debugger for error handling and to test the data flow
- Used the Informatica Designer, Source Analyzer, Warehouse Designer and Mapping Designer
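A minimal star-schema DDL sketch for the kind of campaign mart described above. The table and column names (dim_campaign, dim_channel, fact_campaign_activity) are hypothetical illustrations of the pattern, not the actual Aprimo/Peta model.

```sql
-- Hypothetical star-schema DDL sketch (Oracle); names are illustrative only.
CREATE TABLE dim_campaign (
    campaign_key    NUMBER        PRIMARY KEY,  -- surrogate key
    campaign_id     VARCHAR2(30)  NOT NULL,     -- natural key from the source
    campaign_name   VARCHAR2(200),
    campaign_status VARCHAR2(30)                -- planned / proposed / in flight
);

CREATE TABLE dim_channel (
    channel_key   NUMBER        PRIMARY KEY,
    channel_name  VARCHAR2(100)
);

-- Fact table keyed by the dimension surrogate keys (the "star" joins).
CREATE TABLE fact_campaign_activity (
    campaign_key  NUMBER       NOT NULL REFERENCES dim_campaign (campaign_key),
    channel_key   NUMBER       NOT NULL REFERENCES dim_channel  (channel_key),
    activity_dt   DATE         NOT NULL,
    response_cnt  NUMBER,
    spend_amt     NUMBER(12,2)
);
```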
Environment: Informatica Power Center/Power Mart 7.1, Oracle 8i/9i, SQL, Toad, Windows NT
Confidential is the world leader in manufacturing scanners and printers. The project has challenges in supporting applications belonging to multiple domains and technologies. Domains include Business Management, CRM and the internet infrastructure platform.
Responsibilities:
- Analyzed the business process and created the design document from the functional specifications.
- Arranged dimensions and measures as a Star model as per requirement.
- Installed and configured Informatica on Windows 2000 Server/NT/XP Professional.
- Created source definitions from flat files, imported target tables from the respective databases, and created reusable transformations and mapplets in a shared folder.
- Modified the mappings with mapplets/reusable transformations to satisfy the business logic, using Designer.
- Modified Expression, Lookup, Filter, Router, Normalizer and Aggregator transformations to load the data from source to target using Designer.
- Developed and scheduled the maps on Informatica Server which would initialize the workflows automatically.
- Performed unit testing, which validates that data is mapped correctly, and system testing, which provides a qualitative check that the overall data flow lands correctly in the target tables.
- The data is extracted from flat files as well as from various databases such as Oracle and loaded into the target database (data warehouse tables) using Informatica Power Mart/Power Center 7.1.
- Used debugger for error handling and to test the data flow
- Used the Informatica Designer, Source Analyzer, Warehouse Designer and Mapping Designer
Environment: Informatica Power Center/Power Mart 7.1, Oracle 8i/9i, SQL, PL/SQL, Toad, Windows NT
Confidential
Market Demand Model
Confidential mainly deals with the manufacturing of various engine parts and their maintenance. The project involves forecasting which engine parts may come back for maintenance after the warranty period and the cost involved in manufacturing those parts. The input flat files are ported to Oracle, the required values are calculated for the corresponding fields, and the results are populated into a set of Oracle tables. This is achieved by a set of Informatica maps.
Responsibilities:
- Installed and configured Informatica Power Center 6.1 client tools and connected to each database in the data warehouse using the repository server.
- Used Repository manager to create Repository, User groups, Users and managed users by setting up their privileges and profile.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using Erwin.
- Developed ETL for the Star schema using Informatica and SQL. Implemented the business data mart.
- Implemented Aggregator, Sorter, Router, Filter, Join, Expression, Lookup and Update Strategy, Normalizer, Sequence generator transformations.
- Used the debugger to test the mappings and fix bugs. Developed reusable mapplets and transformations.
- Executed sessions, tasks, sequential and concurrent batches for proper execution of mappings.
- Used dynamic cache memory and index cache to improve the performance of Informatica server.
- Used the Workflow Manager to create workflows and tasks, and created worklets.
- Used the Workflow Monitor to monitor the job submission and progress.
- Tuned ETL procedures and STAR schemas to optimize load and query performance; a minimal tuning sketch is shown at the end of this section.
- Worked on query tuning. Created test plans and strategies for unit testing, system testing and UAT.
- Scheduled and executed batch and session jobs.
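A minimal sketch of the kind of load/query tuning referenced above, in Oracle SQL. The fact and dimension names (fact_part_returns, dim_part) are hypothetical; the actual project used its own star tables.

```sql
-- Hypothetical tuning sketch (Oracle SQL); object names are illustrative only.

-- 1. Index the fact table's dimension key so star joins can use it.
CREATE BITMAP INDEX bix_fact_part_dim ON fact_part_returns (part_dim_key);

-- 2. Refresh optimizer statistics after a large batch load.
BEGIN
    DBMS_STATS.GATHER_TABLE_STATS(
        ownname => USER,
        tabname => 'FACT_PART_RETURNS',
        cascade => TRUE);           -- also gathers index statistics
END;
/

-- 3. Check the execution plan of a typical report query.
EXPLAIN PLAN FOR
SELECT d.part_name, SUM(f.repair_cost)
FROM   fact_part_returns f
JOIN   dim_part d ON d.part_dim_key = f.part_dim_key
GROUP  BY d.part_name;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```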
Environment: Informatica Power Center 6.1, Oracle 9i/8i, PL/SQL, Windows NT/XP, Toad.