- 8 years of experience in system analysis, design, and development of software applications involving ETL methodologies, business intelligence, and data warehousing principles and architectures, including the concepts, design, and use of data warehouses and data marts.
- Proficient in ETL design and development using Informatica Power Center Designer, Repository Manager, Workflow Manager, Workflow Monitor, Metadata Manager, Data Explorer, and Repository Server Administration Console.
- Worked in the areas of data integration, data migration, and data warehouse/business intelligence environments, across all phases of the data warehouse life cycle, from requirements gathering through development, testing, implementation, and support.
- Experienced in business domains including marketing and sales load processes, finance, investment banking, health care, and manufacturing.
- Able to interpret and implement complex business rules, with a good understanding of analyzing source systems and their underlying data for ETL work.
- Strong understanding of dimensional modeling techniques, multi-dimensional database schemas (star and snowflake schemas, fact and dimension tables), and DW concepts. Strong data analysis skills to ensure accuracy and integrity of data in the context of business functionality.
- Extensive knowledge of writing UNIX shell scripts and SQL views, functions, and PL/SQL procedures.
- Proven experience in complete life cycle of Design, Development, Documentation and Maintenance of Data marts and Data warehouse.
- Expertise in enterprise database integration and management, bringing data from MS SQL Server, Oracle, MS Access, DB2, XML, Teradata, and complex flat files into the data staging area.
- Proficient in performance tuning of sources, transformations, targets, mappings, worklets, workflows and sessions.
- Expertise in loading data from fixed-width and delimited flat files.
- Experience with business intelligence tools: COGNOS 8.0, Business Objects XI, OBIEE, and Oracle BI Apps.
- Involved in designing, developing, and debugging reports on reporting tools.
- Well versed with Manual and Automated Testing methodologies and principles.
- A self-starter: target-oriented, self-disciplined, and proactive, with good communication and interpersonal skills and the ability to work effectively on multiple tasks, both as an individual contributor and in team environments.
ETL Tools: Informatica Power Center 9.x/8.6.x/7.x, Power Exchange, Power Connect, Metadata Manager, Data Migration, flat file systems (fixed-width, delimited).
BI Tools: Cognos 8.0, Business Objects 6.5/XI (Web Intelligence XI), OBIEE 10.1.3.x
Data Modeling Tools: Dimensional Data Modeling, Star Schema, Snowflake Modeling, Fact and Dimension tables, Physical and Logical Modeling, Erwin 5.1/4.1
Databases: Oracle 11g/10g/9i, SQL Server 2005/2008, Sybase, MS Access 2000, MySQL, DB2, Teradata v2r6
DB Tools: TOAD 8.6/9.1, SQL*Loader, SQL*Plus, SQL Developer, Autosys, Control-M.
Languages: C, C++, SQL, PL/SQL, UNIX Shell Scripting, VBA
Operating Systems: Windows XP/NT/2000/Vista/7, UNIX

PROFESSIONAL EXPERIENCE:
Confidential, Malta, NY
- Responsible for analyzing, gathering requirements, scope and development planning.
- Managing day-to-day project activities and overall project management in the implementation of large scale data warehouse solutions.
- Work with end-users, database architects, domain experts, other program managers, and SQA resources driven to deliver the highest-quality, leading-edge engineering applications.
- Work with internal design and fab teams, and directly with customers for defining and directing early product yield data warehouse solutions.
- Created STM (source to target mapping), Conceptual data models and SRS (software requirement specification) documents.
- Involved in developing complex ETL code to integrate barcode data for consumables and wafer data into the data warehouse, and to link it to the existing supplier data, i.e. the incoming quality data provided by the supplier with every delivery of material.
- Worked on different transformations and slowly changing dimensions.
- Tuning Informatica Mappings and Sessions for optimum performance.
- Involved in unit testing, data quality check, data completeness.
- Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions and other tasks like Event Raise, Event Wait, Decisions and Email to transport the data to data warehouse tables using Informatica Workflow Manager.
- Involved in Staff Meetings with different departments and Cross-team management to align/implement integration of different systems across the fabs.
- Support system after production rollout.
- Provided management updates on program milestones, project development roadmaps, and marketing collateral in support of GF's differentiated capabilities in these areas.
Environment: Informatica 9.1/9.5, Oracle 11g, Toad, MS SQL Server 2008, Visio Professional 2010, PDF Solutions Exensio, Windows XP, QlikView.
Confidential, New Jersey
- Involved in business analysis, requirements gathering, functional/technical specification, designing and development of End-to-end process for Trading Data Warehouse
- Coordinated new development and enhancements to the existing code; performed impact analysis and ensured changes integrated well with the existing Informatica processes and database design.
- Effectively communicate project expectations and ensure all commitment dates for key project/Jira deliverables are met.
- Worked on trading data for different asset classes (fixed income, commodities, equities), converting VBA scripts to Informatica code, and created optimized views and stored procedures to serve front-end reports on SharePoint.
- Developed complex ETL (Extract/Transform/Load) procedures to provide a way to update/insert/delete data from multiple heterogeneous sources.
- Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches.
- Worked on high volume dataset migration to SQL server.
- Involved in database backup and planning, and created ETL processes for data migration by transferring data of different formats and structures.
- Performance tuning of Informatica components for daily and monthly incremental loading.
- Conduct and participate in code reviews, validation, testing and debugging.
- Worked on high volume of data and involved in performance tuning of ETL code to optimize session performance for session run during the available load window.
- Responsible for Troubleshooting ETL issues and bug fixes.
- Created UNIX shell scripts/file-watcher jobs that check whether all related files are present in the specified directory, kicking off the Informatica jobs once all files have arrived.
- Involved in migrating ETL code from DEV to QA and QA to PROD by using Import/Export method.
- Provided complete production support, fixed production issues, and performed failure analysis for Informatica loads using error logs.
- Created and maintained several custom reports and universes for the client using Business Objects, and generated reports by querying databases with BO tools.
- Extensive experience in creating executive reports with BI Publisher integrated with OBIEE.
- Created run books and completed documentation of database structures and processes.
- Provided written status reports to executive management regarding project status, tasks, and issues.
- Coordinated with onshore and offshore teams.
Environment: Informatica Power Center 9.1, SQL Server 2008, Access 2000, Business Objects XI, OBIEE, SQL Developer, Autosys 4.5, UNIX, Windows XP, Erwin 4.0, Embarcadero Rapid SQL 7
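A minimal sketch of the file-watcher pattern described above: poll a landing directory until every expected feed file is present, then hand off to the Informatica workflow. The directory, file names, and the commented-out pmcmd call are hypothetical placeholders, not the production values.

```shell
#!/bin/sh
# File-watcher sketch: poll a landing directory until every expected feed
# file is present, then hand off to the Informatica workflow.
# All names below (directory, files, workflow) are hypothetical.

wait_for_files() {
    watch_dir=$1
    timeout_secs=$2
    shift 2                      # remaining arguments are the expected files
    elapsed=0
    while [ "$elapsed" -le "$timeout_secs" ]; do
        missing=0
        for f in "$@"; do
            [ -f "$watch_dir/$f" ] || missing=1
        done
        if [ "$missing" -eq 0 ]; then
            echo "All files present; starting workflow"
            # A real job would now call, e.g.:
            # pmcmd startworkflow -sv IntSvc -d Domain -f Folder wf_daily_load
            return 0
        fi
        sleep 1
        elapsed=$((elapsed + 1))
    done
    echo "Timed out waiting for files in $watch_dir" >&2
    return 1
}

# Demo: create the expected feeds, then watch for them.
demo_dir=$(mktemp -d)
touch "$demo_dir/trades.dat" "$demo_dir/positions.dat"
wait_for_files "$demo_dir" 5 trades.dat positions.dat
```

In production the script would typically be scheduled by Autosys, with the timeout sized to the load window.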
- Involved in the complete SDLC: business analysis, requirements collection from business and vendors, design, development, testing, and reporting.
- Designed ETL technical specifications to migrate data from existing PL/SQL PAR POS system to Oracle Data Warehouse BI2.0 project.
- Involved in the Dimensional Data Modeling to define entities and attributes that are required for the business process and coordinate with Database Administrator.
- Developed complex mappings, including SCD Type I, Type II, and Type III mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, Router, and SQL transformations. Created complex mapplets for reusability.
- Pulled data from different sources like Database, XML, Flat files.
- Worked extensively on SQL queries, views, and stored procedures.
- Identified bugs in existing mappings by analyzing the pipeline data flow and evaluating transformations, fixed the bugs to conform to business needs, and redesigned the existing mappings for improved performance.
- Responsible for the performance and tuning of the ETL application, including performing maintenance/upgrades on applications programs and systems as required.
- Responsible to ensure that all assigned systems and applications are defined, tested, backed up and have necessary recovery procedures for critical data.
- Created pre-session and post-session UNIX shell scripts and mail notifications.
- Designed the parallel and sequential session workflows and scheduled the workflows and sessions using workflow manager.
- Implemented Command, Control, Event Raise, and Event Wait tasks within the workflows to streamline session scheduling.
- Coordinated with team members in development process to integrate with the Test and Production Servers and helped in creating use cases.
- Developed, implemented, and enforced ETL best-practice standards using the Velocity methodology.
- Completed documentation of database structures and processes used to move data in and out of database for future development and expansion of database.
- Involved in creating reports using Oracle business intelligence tool OBIEE.
- Provided written status reports to management regarding project status, tasks, and issues.
Environment: Informatica Power Center 8.6, Oracle 10g, OBIEE 10.1.3, SQL Developer, UNIX, Windows XP, Erwin 4.0
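As an illustration of the SCD Type II pattern mentioned above, here is a sketch of the SQL such a mapping effectively produces, generated by a small shell driver that in practice would hand it to sqlplus. The schema objects (dim_customer, stg_customer, dim_customer_seq) are hypothetical, and the sqlplus call is stubbed out.

```shell
#!/bin/sh
# Sketch: generate the two statements behind an SCD Type II load,
# expire the changed current row, then insert the new current version.
# Schema objects (dim_customer, stg_customer, dim_customer_seq) are
# hypothetical; a real run would feed the file to sqlplus.

make_scd2_sql() {
    out_file=$1
    cat > "$out_file" <<'EOF'
-- Close out current rows whose tracked attribute changed in staging
UPDATE dim_customer d
   SET d.effective_end_date = SYSDATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address);

-- Insert a fresh current version for new and changed customers
INSERT INTO dim_customer
    (customer_key, customer_id, address,
     effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
EOF
}

sql_file=$(mktemp)
make_scd2_sql "$sql_file"
echo "Generated SCD Type II script:"
cat "$sql_file"
# Production would run: sqlplus -s user/pass@db @"$sql_file"
```

In Informatica the same effect is achieved with a Lookup on the current dimension row plus an Update Strategy transformation; the SQL above mirrors that flow for readers outside the tool.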
Confidential, Dayton, OH
- Used Informatica ETL to load data from flat files (fixed-length as well as delimited) and SQL Server into the data mart on an Oracle database.
- Used Informatica Data Quality to formulate data conversion and data cleaning rules.
- Performed Data Integration, Data Standardization using IDQ.
- Implemented both Type 1 and Type 2 ("keep history") slowly changing dimensions (SCD).
- Performance tuning of Informatica components for daily and monthly incremental loading of around 50 tables.
- Developed mapping using reusable transformations, parameters and Variables.
- Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions and other tasks like Timer, Event Raise, Event Wait, Decisions and Email.
- Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce running time.
- Involved in creating test plans, test cases to unit test the mappings, sessions and workflows.
- Used ETL debugger extensively to identify the performance bottlenecks within the mappings.
- Created and used UNIX shell scripts to copy input files from the DEV environment to the PROD environment, sending a mail once the process completes successfully and notifying users of any failure, with its cause, so appropriate troubleshooting action can be taken.
- Involved in building models, packages using Framework Manager and published the packages to Cognos Connection.
- Created advanced reports using Report Studio.
Environment: Informatica Power Center 8.5.1, COGNOS 8.4, SQL Server 2005, Sun Solaris OS, ERWIN 4.0, Autosys 3.0, IDE8.6.0, IDQ8.6.0, SQL Developer.
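The copy-and-notify scripts described in this section follow a simple shape: copy the input files, then report success or the cause of failure. A minimal sketch, with hypothetical directories and recipient, and the real mailx notification replaced by echo so the sketch runs anywhere:

```shell
#!/bin/sh
# Sketch of a copy-and-notify job: copy input files between environments
# and report the outcome. Recipient and directories are hypothetical, and
# the real mailx notification is replaced by echo.

copy_and_notify() {
    src_dir=$1
    dest_dir=$2
    recipient=$3
    err_log=$(mktemp)
    if cp "$src_dir"/*.dat "$dest_dir"/ 2>"$err_log"; then
        # Production: mailx -s "Input files copied" "$recipient" < /dev/null
        echo "SUCCESS: files copied from $src_dir to $dest_dir; notified $recipient"
        return 0
    else
        # Production: mailx -s "Copy FAILED" "$recipient" < "$err_log"
        echo "FAILURE: $(cat "$err_log"); notified $recipient" >&2
        return 1
    fi
}

# Demo run with temporary directories standing in for DEV and PROD.
src=$(mktemp -d); dest=$(mktemp -d)
touch "$src/input1.dat"
copy_and_notify "$src" "$dest" ops@example.com
```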
Confidential, Inc., Newark, NJ
- Developed BTEQ and MultiLoad scripts to migrate tables from Oracle to Teradata.
- Sourced data from Teradata to Oracle using FastExport and Oracle SQL*Loader.
- Created load scripts using the Teradata FastLoad and MultiLoad utilities, and procedures in SQL Assistant.
- Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
- Used debugger to test the mapping and fixed the bugs.
- Developed mappings using Mapping Designer to standardize data as per business requirements.
- Worked extensively on fine tuning SQL overrides for enhancing performance.
- Worked extensively with Mapping Designer to join Oracle and Teradata tables using Joiner transformation.
- Worked on developing workflows and sessions and monitoring them to ensure data (target file) is properly loaded into the warehouse.
- Used debugger extensively to identify the bottlenecks in the mappings
- Responsible for scheduling reports, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
- Used Control-M for automatic job scheduling.
- Collect and link metadata from diverse sources, including relational databases, modeling tools, integration processes, enterprise applications, and mainframe systems, into a central catalog.
- Used Power Center Data Analyzer to generate ad-hoc reports and maintain accurate, relevant, high-quality reports.
- Extensively involved in ensuring that the data flowing through mappings is consistent and organized.
Environment: Informatica Power Center 8.1.1, SQL, Teradata, Teradata SQL Assistant, BTEQ, MultiLoad, FastLoad, FastExport, Control-M, PL/SQL, UNIX, Windows NT.
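Loads like the BTEQ and MultiLoad work above are typically driven from shell: generate the utility script, then hand it to the client. A sketch with hypothetical table, file, and logon placeholders; the actual bteq invocation is guarded so the script still runs where no Teradata client is installed.

```shell
#!/bin/sh
# Sketch: build a BTEQ import script for a staging load, then run it only
# if the bteq client exists. Logon string, file path, and table names are
# placeholders, not real credentials or objects.

make_bteq_script() {
    out_file=$1
    cat > "$out_file" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.IMPORT VARTEXT ',' FILE = /data/stage/trades.csv;
.QUIET ON
.REPEAT *
USING (trade_id VARCHAR(18), amount VARCHAR(20))
INSERT INTO stage_db.trades (trade_id, amount)
VALUES (:trade_id, :amount);
.LOGOFF;
.QUIT;
EOF
}

script=$(mktemp)
make_bteq_script "$script"
if command -v bteq >/dev/null 2>&1; then
    bteq < "$script"
else
    echo "bteq not installed; generated script follows:"
    cat "$script"
fi
```

The same generate-then-submit shape applies to FastLoad and MultiLoad jobs, with their respective control syntax in the heredoc.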