- 7+ years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica PowerCenter 9.x/8.x and Informatica IDQ 9.x, including Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.
- Good knowledge of data modeling techniques such as dimensional/star schema and snowflake modeling using the Erwin data modeling tool.
- Worked extensively on data profiling and data validation to check data quality before providing the first cut of data to the MDM team.
- Knowledge in Full Life Cycle development of Data Warehousing.
- Strong understanding of OLAP and OLTP Concepts.
- Good experience with installation and configuration of the domain, Repository Service, and Integration Service.
- Experience working in both Waterfall and Agile methodologies.
- Expertise in RDBMS concepts, with hands on exposure in the development of relational database environment using SQL, PL/SQL, Cursors, Stored Procedures, Functions and Triggers.
- Strong with relational database design concepts.
- Solid experience combining Informatica and Teradata in an enterprise data warehouse environment, with extensive use of Teradata utilities such as TPT, FastLoad, MultiLoad, and BTEQ scripts.
- Performed data validation by Unit testing, integration testing and System Testing.
- Understood business rules completely based on high-level design specifications and implemented data transformation methodologies accordingly.
- Extensive experience writing UNIX shell scripts and Perl scripts, and automating ETL processes using UNIX shell scripting.
- Experience using automated scheduling tools such as Autosys and Control-M.
- Extensive experience in managing teams/On Shore-Off Shore Coordination/Requirement Analysis/Code reviews/Implementing Standards.
- Good interpersonal skills with a strong ability to interact with end users, clients, and team members.
- Flexible, enthusiastic, and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.
- Able to work independently and collaborate proactively & cross functionally within a team.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
ETL Tools: Informatica PowerCenter 9.x/8.x, Power Exchange 9.x/8.x, Informatica IDQ 9.x, Informatica MDM, Informatica MDM Hub Console.
Databases: Oracle 11g/10g/9i, MS SQL Server 2012/2010, Teradata 13/12, DB2 & MS Access
Languages: C, SQL, PL/SQL, HTML, JAVA, UNIX Scripting & Python Scripting
Other Tools: Toad, SharePoint, Putty, GIT, MATT, Autosys and Control-M.
Reporting Tools: Cognos, Tableau.
Operating Systems: Linux, UNIX, Sun Solaris, Windows 7/XP/2000/98
Confidential, Richmond, VA
Senior Informatica /IDQ Developer
- Involved in requirement analysis, ETL design and development for extracting data from the source systems like Oracle, flat files, XML files and loading into EDW (Enterprise Data Warehouse).
- Responsible for design and implementation needed for loading and updating the warehouse.
- Worked with analysts and data source systems experts to map requirements to ETL code.
- Developed complex mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Router, Update strategy and Sequence generator.
- Converted functional specifications into technical specifications (design of mapping documents).
- Worked on developing UNIX Scripts for job automations.
- These extracts contained the changes identified by comparing incoming EPIC data against the data already loaded into MDM.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used Autosys and cron to schedule jobs.
- Created deployment groups to deploy objects.
- Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
- Worked on data cleansing using the cleanse functions in Informatica MDM.
- Efficiently used Informatica Workflow Manager and Workflow Monitor to create, schedule, and control workflows, tasks, and sessions.
- Deployed mappings and mapplets from IDQ to PowerCenter for scheduling.
- Used IDQ and Informatica PowerCenter to cleanse the data and load it into the target database.
- Created scripts for better handling of incoming source files such as moving files from one directory to another directory and extracting information from file names, such as date, for continuously incoming sources.
- Performed performance tuning, maintained and fixed production issues in existing code, and modified existing code per new business requirements.
- Wrote complex SQL scripts to avoid Informatica joiners and Look-ups
- Coordinated with DBAs in resolving the database issues that led to production job failures.
- Used Debugger to test the mappings and fixed the bugs.
- Fixed the issues that came out of integration testing.
- Created work tables, global temporary tables, volatile tables as part of developing the script/code.
- Developed scripts for loading data into the base tables in the EDW using the FastLoad, MultiLoad, and BTEQ utilities of Teradata.
- Worked on Teradata SQL, BTEQ, MultiLoad, FastLoad, and FastExport for ad-hoc queries, and built UNIX shell scripts to run ETL interfaces through BTEQ, FastLoad, or FastExport. Created numerous volatile, global temporary, SET, and MULTISET tables, and created batch jobs for FastExport.
- Created shell scripts for Fast export and Fast load.
- Migrated code/objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.
- Involved extensively in Unit testing, integration testing, system testing and UAT.
Confidential, Lewisville, TX
Senior Informatica/ IDQ Developer
- Worked with ETL Architects and Senior developers for understanding the requirements.
- Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner, and Stored Procedure transformations.
- Defined business rules in Informatica Data Quality (IDQ) to evaluate data quality by creating cleanse processes that monitor compliance with standards; also identified data quality gaps and assisted in resolving data quality issues.
- Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems.
- Identified and fixed bottlenecks and tuned mappings and sessions to improve performance; tuned both the ETL process and the databases.
- Performed CDC based on MDM last update date, which was indexed and maintained by MDM.
- Developed shell scripts, PL/SQL procedures, for creating/dropping of table and indexes of performance for pre and post session management.
- Implemented slowly changing dimensions - Type I, II &III in different mappings as per the requirements.
- Migrated mappings, workflows, and parameter files from development to production.
- Used Cognos 10.1 BI Framework Manager to Build Models (Semantic layers, Cardinality, Relationships, Query subjects), Packages and publish packages to Cognos connection and implemented security for the packages.
- Created complex reports using Cognos 10.1 Report Studio and Ad-hoc reports using Query Studio.
- Developed complex reports from lists, Cross-tabs through Complex dashboards, Conditional formatting, Drill through, & master/detail reports for End users.
- Developed Framework Manager models following best practices, and modified them to incorporate new data elements as per business needs.
- Developed Standard Reports, Charts, and Drill through Reports, Master-detail & conditional formatting reports using Report Studio and Query Studio.
- Developed multidimensional models (DMR) and publishing packages using framework manager and then generated reports from these packages.
- Created prompts, Calculations, Filters, Developing Prompt pages and Conditional Variables using Report Studio.
- Tested and validated the Report output against database to measure performance of reports.
- Worked on creating Unit testing documents.
- Utilized Tableau capabilities such as data extracts, data blending, forecasting, dashboard actions, and table calculations. Created blended views of data from various schemas and data sources.
Environment: Informatica PowerCenter 9.6.2, Informatica IDQ 9.6.2, Tableau Desktop 9.0, Oracle 11g, MS-Excel, UNIX Shell Scripting, WinSCP.
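The slowly changing dimension work mentioned above (Type I, II & III) can be illustrated with a Type 2 example: expire the current version of a changed row and insert a new current version. This is a simplified in-memory sketch, not the actual Informatica mapping; the customer/address fields are hypothetical.

```python
from datetime import date

def apply_scd2(dimension: list, incoming: dict, today: date) -> None:
    """Type 2 SCD: expire the current row for a changed key and append a new current row."""
    for row in dimension:
        if row["cust_id"] == incoming["cust_id"] and row["current"]:
            if row["address"] == incoming["address"]:
                return  # no change detected, keep existing current row
            row["current"] = False      # expire the old version
            row["end_date"] = today
            break
    dimension.append({**incoming, "current": True,
                      "start_date": today, "end_date": None})
```

Type 1 would overwrite the attribute in place, and Type 3 would keep the prior value in a dedicated "previous" column instead of adding rows.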
Informatica IDQ Developer
- Worked on Designer tools like Source Analyzer, Target Designer, Transformation Developer, Mapping Designer and Workflow Designer.
- Worked with the Informatica Data Quality 9.6 (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, porting, and monitoring capabilities.
- Involved in Debugging and Performance tuning of targets, sources, mappings and sessions.
- Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ).
- Deployed mappings and mapplets from IDQ to PowerCenter for scheduling, and exposed some of them as web services.
- Developed various Mappings, Mapplets, Workflows and Transformations for flat files and XML.
- Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning using Netezza Database.
- Created Netezza SQL scripts to verify that tables loaded correctly.
- Utilized the Informatica data quality management suite (IDQ and IDE) to complete initial data profiling and also to identify and merge customers and addresses.
- Cleansed, standardized, labeled, and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
- Created various rules in IDQ to satisfy the Completeness, Conformity, Integrity and Timeliness.
- Exposure to Informatica B2B Data Transformation that supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards which govern the data formats.
- Designed Mappings using B2B Data Transformation Studio.
- Experience in Systems Integration using Informatica B2B Data Exchange.
- Tuned the performance of Informatica PowerCenter sessions for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
- Environment: Informatica PowerCenter 9.6, Informatica IDQ 9.6, Oracle 11g, Netezza 4.2, UNIX, FTP, Toad.
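The cleansing and rule-based quality checks described above (trimming, standardization, completeness and conformity rules against reference tables) can be sketched outside IDQ like this. The state-code reference table and field names are assumptions for illustration only; in IDQ these would be mapplet rules and managed reference data.

```python
# Hypothetical reference table of valid state codes (assumed for illustration)
VALID_STATES = {"CA", "TX", "VA", "NY"}

def cleanse_record(rec: dict) -> dict:
    """Trim whitespace, standardize case, and flag completeness/conformity issues."""
    out = {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}
    out["state"] = (out.get("state") or "").upper()
    issues = []
    if not out.get("cust_name"):
        issues.append("INCOMPLETE:cust_name")    # completeness rule
    if out["state"] and out["state"] not in VALID_STATES:
        issues.append("NONCONFORMING:state")     # conformity rule vs. reference table
    out["dq_issues"] = issues
    return out
```

Records with a non-empty `dq_issues` list would be routed to an exception table for review rather than loaded to the target.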
Confidential, Los Angeles, CA
- Designed ETLs to extract data from COBOL files and update the EDW with patient-related clinical information, including a one-time history load and subsequent daily loads.
- Ran the same shell scripts across the different environments.
- Worked on PowerExchange for importing source data to the Power center.
- Created mappings to read COBOL source files and write them in ASCII file format, with the target file structure matching the source.
- Generated complex Transact SQL (T-SQL) queries, Sub queries, Co-related sub queries, Dynamic SQL queries etc.
- Performance Tuning of Stored Procedures and SQL queries using SQL Profiler and Index Tuning Wizard in SSIS.
- Created sprint plans for developer responsibilities and scheduled sprints in VersionOne.
- Fixed defects in different environments for platform code.
- Developed and deployed SSIS packages for ETL from OLTP and various sources to staging, and from staging to the data warehouse, using Lookup, Fuzzy Lookup, Derived Column, Conditional Split, Slowly Changing Dimension, and other transformations. Performed ETL mappings using MS SQL Server Integration Services.
- Extracted and reviewed data from heterogeneous sources from OLTP to OLAP using MS SQL Server Integration Services (SSIS).
- Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
- Executed conversion maintenance on existing legacy system.
- Created mapping documents with business rules using MATT Tool.
- Developed professional reports in Report Studio and ad hoc reports in Query Studio.
- Created list reports, cross-tab reports, chart reports, and reports with features like conditional formatting, page breaks, master-detail, drill through, drill up, and drill down, and extensively used other features in Cognos 8.
- Developed Unit Test Cases, documents for report validation and testing after interacting with the end users.
- Developed filters, calculations, prompts, and conditions, and created various reports for users using Cognos Report Studio.
- Created jobs to schedule reports in Cognos connection.
- Experience in developing new complex reports and customizing existing reports and change request processing.
- Strong Experience in Developing Transformer Models, cubes, views.
- Worked on GIT Bash for code check in into the different environments using GIT commands.
- Used Spash to verify checked-in code in different environments.
- Created users, groups, and roles and granted privileges to users; created folders and relational and application connections, and configured ODBC connectivity.
- Environment: Informatica PowerCenter 9.5, Power Exchange 9.5, MS SQL Server 2012, SSMS, SSIS, UNIX, Data Marts, UNIX Shell Scripting, MATT, Spash, VersionOne, and GIT.
- Developed various Mappings, Mapplets, and Transformations for the Data warehouse.
- Involved in creation of Data Warehouse database (Physical Model, Logical Model) using Erwin data modeling tool.
- Extracted data from Teradata into Informatica PowerCenter, developed code using different transformations, and loaded data into the Teradata landing area.
- Worked on loading data from several flat file sources into Teradata using Teradata MLOAD and FLOAD.
- Set up batches and sessions to schedule the loads at the required frequency using PowerCenter Workflow Manager.
- Used Teradata manager, Index Wizard and PMON utilities to improve performance.
- Extensively worked on Autosys to schedule the jobs for loading data.
- Used MultiLoad and BTEQ; created and modified databases, performed capacity planning, allocated space, and granted rights for all objects within databases.
- Worked on Power Exchange for change data capture (CDC).
- Worked on Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL, and BTEQ utilities.
- Created and modified MultiLoad scripts for Informatica using UNIX, and loaded data into the IDW.
- Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
- Responsible for the Data Cleansing of Source Data using LTRIM and RTRIM operations of the Expression Transformation.
- Involved in Informatica Data Masking & Data Subset Data Mapping.
- Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
- Developed QlikView dashboards using charts (straight table, pivot table, line, combo chart, etc.), list boxes, multi box, input field, table box, statistics box, etc.
Environment: Informatica PowerCenter 9.1, Power Exchange 9.1, Teradata 13.10, Oracle 11g, Data Marts, Erwin Data Modeler 4.1, UNIX Shell Scripting, Data Modeling, Autosys.
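The change data capture (CDC) work mentioned above can be illustrated with a simplified snapshot-diff: classify rows as inserts, updates, or deletes by comparing a keyed previous snapshot against the current one. This is an in-memory sketch for illustration; PowerExchange CDC actually reads changes from database logs rather than diffing full snapshots.

```python
def detect_changes(previous: dict, current: dict) -> dict:
    """Classify rows as inserts, updates, or deletes by comparing keyed snapshots."""
    inserts = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deletes = {k: v for k, v in previous.items() if k not in current}
    return {"insert": inserts, "update": updates, "delete": deletes}
```

Each bucket would then drive the corresponding Update Strategy branch (DD_INSERT, DD_UPDATE, DD_DELETE) in the downstream mapping.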