Delivery & Support Lead Resume
Bentonville, AR
SUMMARY
- Highly skilled IT professional with more than 16 years of experience in the analysis, design, construction, and deployment of data warehouse (DW) projects in the Retail, Telecom, and Insurance domains.
- Cross-industry experience and expertise in defining and executing data warehousing strategies that increase competitive advantage.
- Expert in implementing end-to-end solutions using the MS BI stack: SQL Server, Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS), versions 2012/2008 R2/2005.
- Extensive use of DAX (Data Analysis Expressions) functions for reports and Tabular models.
- Hands-on experience with Microsoft Azure SQL Database.
- Hands-on experience in ETL (Extraction, Transformation, and Loading) processes using Informatica PowerCenter 9.0.1/8.6/7.x.
- Well acquainted with Informatica Designer, Workflow Manager and Workflow Monitor Components.
- Expertise in Performance tuning (sources, mappings, targets and sessions) and Database Performance tuning.
- Experience in integrating various Operational Data Sources (ODS) with multiple feeds/sources such as VSAM files, flat files, and RDBMS tables.
- Expertise in ETL full-load, delta-load, and reconciliation processes.
- Experience in IBM Cognos 10 BI (Framework Manager, Report Studio, Query Studio, Metric Studio, Analysis Studio, and Cognos Connection) and ReportNet 1.1.
- Hands-on experience with Cognos TM1 9.2 (Architect, Perspectives, Turbo Integrator (TI), Chores).
- Designed and developed metadata models in Cognos Framework Manager.
- Participated in planning, designing, developing, and implementing data warehouses/data marts, with experience in both relational and multidimensional database design.
- Data modeling experience in designing and implementing Data Mart and Data Warehouse applications using ERwin.
- Worked extensively on forward- and reverse-engineering processes; created DDL scripts to implement data models.
- Created ERwin reports in HTML and RTF formats as required, created naming-convention files, and coordinated with DBAs to apply data model changes.
- Extensive knowledge of Slowly Changing Dimensions (SCDs); implemented Type 1, Type 2, and Type 3 dimensions to track historical data (a minimal SCD Type 2 sketch follows this summary).
- Created source-to-target mapping documents and process-flow documentation for Informatica workflows.
- Experience building UNIX shell scripts to run ETL jobs and automate the end-to-end process.
- Used scheduling tools such as Control-M, Informatica Scheduler, and EFS.
- Experience in RDBMS with good programming knowledge in Oracle 10g, SQL Server 2008 R2/ 2005, DB2, Teradata, SQL, PL/SQL, DB Triggers, Views, Stored Procedures, Functions and Packages.
- Worked as support coordinator in Incident Management and Problem Management areas.
- Good working knowledge of 24x7 production support environments using tools such as BMC Remedy.
- Ability to learn very quickly and apply new skills to existing problems.
- Good coordination skills in managing offshore resources.
- Team player with strong communication, writing, and technical documentation skills; a self-motivated individual with exemplary analytical and problem-solving skills.
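Below is a minimal sketch of the SCD Type 2 load pattern referenced above, in T-SQL. The DimCustomer dimension, stg.Customer staging table, and tracked columns are hypothetical names used only for illustration, not objects from an actual project.

```sql
-- Minimal SCD Type 2 pattern (hypothetical tables and columns).
-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE d
SET    d.EffectiveEndDate = GETDATE(),
       d.IsCurrent        = 0
FROM   dbo.DimCustomer AS d
JOIN   stg.Customer    AS s
       ON s.CustomerID = d.CustomerID
WHERE  d.IsCurrent = 1
  AND (s.City <> d.City OR s.Segment <> d.Segment);

-- Step 2: insert a new current row for new and changed customers
-- (after step 1, changed customers no longer have a current row).
INSERT INTO dbo.DimCustomer
       (CustomerID, City, Segment, EffectiveStartDate, EffectiveEndDate, IsCurrent)
SELECT  s.CustomerID, s.City, s.Segment, GETDATE(), NULL, 1
FROM    stg.Customer AS s
LEFT JOIN dbo.DimCustomer AS d
       ON d.CustomerID = s.CustomerID
      AND d.IsCurrent  = 1
WHERE   d.CustomerID IS NULL;
```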
TECHNICAL SKILLS
ETL Tools: SSIS 2012/2008 R2/2005, Informatica 9.0.1 HotFix 2/8.6 (PowerCenter & Informatica Data Quality).
BI Tools: IBM Cognos 10, Business Objects, Crystal Reports, SSRS 2012/2008, Qlikview, Power BI, Power Map, Power Pivot, Power View.
OLAP: SSAS 2012/2008 R2, Cognos TM1.
Modeling Tool: ERwin, MS Visio
Scheduling tools: SQL Server Agent, Control-M, Informatica Scheduler.
Databases: SQL Server 2012/2008 R2/2005, Microsoft Azure SQL Database, Oracle 10g, DB2, Teradata, MS Access, PeopleSoft App Designer.
Operating Systems: UNIX (Linux, AIX, HP-UX, Solaris), MS DOS and Windows NT/XP.
Languages: PL/SQL, T-SQL (DDL, DML), VB.NET, Excel VBA, XML, ABAP, DAX, MDX.
Web Server: IIS 6.0.
Database Tools: Toad, MS SQL Server Management Studio, Oracle SQL Developer.
Web Technologies: ASP.NET 2.0, XML Web Services, PHP.
Version and Source Control: TFS, Visual Source Safe (VSS 6.0), Subversion.
Support tools: SysAid, BMC Remedy, ClearCase.
PROFESSIONAL EXPERIENCE
Confidential, Bentonville, AR
Delivery & Support Lead
Responsibilities:
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Responsible for designing and developing the entire solution using the SQL Server 2012 data tools (SSIS, SSAS, and SSRS).
- Created and managed SQL Server databases on Azure (IaaS).
- Set up connection strings and connected to Azure SQL databases from locally installed SQL Server Management Studio (SSMS); see the connection-setup sketch following this role.
- Provided operational support to modify existing SSAS Tabular models to satisfy new business requirements.
- Used DAX (Data Analysis Expressions) functions to create calculations and measures in the Tabular models.
- Created SSIS ETL jobs using Visual Studio 2010 to extract data from the ERP data source.
- Worked with complex SQL, Stored Procedures, Triggers and packages in very large databases from various servers.
- Worked on Power BI and prepared SSRS dashboards with improved visual presentation.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
- Deployed developed components such as SSIS packages and SQL code to the Test, QA, and Production environments.
Technologies: SQL Server 2012/2008, Microsoft Azure, SSIS, SSAS, SSRS, TFS, Power BI, Power Pivot, Power View
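A minimal sketch of the Azure SQL connection setup referenced in this role, assuming an Azure SQL Database (for an IaaS SQL Server VM the firewall step would differ). The login, firewall rule, server name, and IP address are placeholders, not actual project values.

```sql
-- Hypothetical Azure SQL Database setup so SSMS can connect from a workstation.
-- Run in the logical server's master database: create a login and a firewall rule.
CREATE LOGIN etl_reader WITH PASSWORD = '<strong-password>';
EXEC sp_set_firewall_rule
     @name = N'ClientWorkstation',
     @start_ip_address = '203.0.113.10',
     @end_ip_address   = '203.0.113.10';

-- Run in the user database: map the login to a user and grant read access.
CREATE USER etl_reader FOR LOGIN etl_reader;
ALTER ROLE db_datareader ADD MEMBER etl_reader;

-- SSMS (or any client) then connects with a connection string such as:
-- Server=tcp:<servername>.database.windows.net,1433;Database=<dbname>;
-- User ID=etl_reader;Password=<strong-password>;Encrypt=True;
```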
Confidential, Libertyville, IL
Delivery & Support Lead
Responsibilities:
- Worked in an Agile Scrum methodology with daily stand-up meetings; broke the project down into individual tasks.
- Participated in designing data marts that supported analytical reporting for SO, PO, and Inventory.
- Created ETL packages using SSIS to extract data from heterogeneous databases and then transform and load it into the data mart.
- Analyzed and created multidimensional OLAP cubes with fact and dimension tables using SSAS for Inventory Aging, SO, and PO (see the star-schema query sketch after this role).
- Created cubes, dimensions, and business-critical KPIs using SQL Server Analysis Services, representing aggregations hierarchically and with custom groupings that the company uses to analyze performance; wrote several MDX queries according to business requirements.
- Designed and developed the metadata model in Cognos Framework Manager and built reports through Report Studio.
- Led the onsite and offshore teams toward successful delivery of the projects; coordinated and monitored the offshore development team per the business requirements.
- Reviewed the TM1 Business Requirements Document (BRD) for completeness; gathered and analyzed financial planning, budgeting, and reporting requirements.
- Developed TM1 Turbo Integrator (TI) processes to load and retrieve data from database tables and other data sources, and to create dimensions, hierarchies, and cubes.
- Used the TM1 Rules editor to create and edit rules for various calculations.
- Worked in TM1 Architect to maintain TM1 users, roles, groups, and security.
- Trained users on installing and using the Cognos TM1 Perspectives add-in for Microsoft Excel.
- Involved in upgrading databases and SSIS packages from SQL Server 2008 R2 to SQL Server 2012.
- Developed SSAS 2012 Tabular models and created Power BI reports.
Technologies: SQL Server 2012/2008, SSIS, SSAS, Cognos Reports, Cognos TM1, Excel VBA, QlikView, TFS, Power BI
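A minimal sketch of the kind of star-schema query the inventory data mart in this role would support, in T-SQL. FactInventory, DimItem, DimDate, and their columns are hypothetical names used only for illustration.

```sql
-- Hypothetical inventory-aging query over a star schema:
-- one fact table joined to item and date dimensions, aggregated by category and month.
SELECT  di.ItemCategory,
        dd.CalendarYear,
        dd.CalendarMonth,
        SUM(fi.OnHandQty) AS TotalOnHandQty,
        SUM(CASE WHEN fi.DaysOnHand > 90
                 THEN fi.OnHandQty ELSE 0 END) AS AgedOver90DaysQty
FROM    dbo.FactInventory AS fi
JOIN    dbo.DimItem AS di ON di.ItemKey = fi.ItemKey
JOIN    dbo.DimDate AS dd ON dd.DateKey = fi.SnapshotDateKey
GROUP BY di.ItemCategory, dd.CalendarYear, dd.CalendarMonth
ORDER BY dd.CalendarYear, dd.CalendarMonth, di.ItemCategory;
```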
Confidential, NYC NY
Data Migration Developer
Responsibilities:
- Analyzed source data on the EAS application side; verified the accuracy of field data through meetings with SMEs.
- Identified the target fields to which the data would migrate and prepared the mapping document.
- Defined conversion rules, business logic, and validation and cleansing processes for the fields.
- Identified data dependencies and set the sequence of Informatica ETL sessions.
- Executed Informatica jobs to perform full and delta loads of the source data.
- Prepared a reconciliation report in SSRS 2008 R2, delivered as an Excel document, which served as the basis for validating the data from the application view (see the reconciliation query sketch after this role).
Technologies: Informatica 9.0.1 HotFix 2, SQL Server 2008 R2 (database and SSRS), SQL Server 2000.
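A minimal sketch of the kind of reconciliation check that could sit behind such an SSRS report, assuming a staged copy of the legacy source; stg.LegacyPolicy, dbo.Policy, and PolicyYear are hypothetical names used only for illustration.

```sql
-- Hypothetical reconciliation check: compare row counts between the staged legacy
-- source and the migrated target, grouped by a business attribute.
SELECT  COALESCE(s.PolicyYear, t.PolicyYear)                      AS PolicyYear,
        ISNULL(s.SourceRowCount, 0)                               AS SourceRowCount,
        ISNULL(t.TargetRowCount, 0)                               AS TargetRowCount,
        ISNULL(s.SourceRowCount, 0) - ISNULL(t.TargetRowCount, 0) AS RowCountDelta
FROM   (SELECT PolicyYear, COUNT(*) AS SourceRowCount
        FROM   stg.LegacyPolicy
        GROUP BY PolicyYear) AS s
FULL OUTER JOIN
       (SELECT PolicyYear, COUNT(*) AS TargetRowCount
        FROM   dbo.Policy
        GROUP BY PolicyYear) AS t
       ON t.PolicyYear = s.PolicyYear
ORDER BY 1;
```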
Confidential, Confidential NY
BI Developer
Responsibilities:
- Complied with the developer responsibilities of Confidential's internal SDLC framework; prepared technical design documentation and test cases, and coordinated with users to complete UAT.
- Created Oracle tables in dev environment with PeopleSoft App Designer.
- Developed Source to Target Matrix which contains the transformation logic.
- Used Star Schema methodology in building and designing the logical data model, and implemented SCD2 dimensional models.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy and Sequence generator.
- Developed complex Informatica mappings, tasks, sessions, and workflows for weekly and monthly processes to load heterogeneous data into the data warehouse.
- Built complex reports in Cognos 8.4, such as drill-through and ad hoc reports.
- Created Query Prompts, Calculations, Conditions and Filters in the Cognos reports.
- Designed Microsoft Excel 2007 ad-hoc report using VBA macro code.
- Scheduled Informatica workflows and Cognos reports with Control-M.
- Supported the application using the ClearCase tool.
Technologies: Informatica 8.6, Cognos 8.4, PeopleSoft-Oracle 10g, PeopleSoft App Designer, Oracle Developer, UNIX, Control M.
Confidential, Plano TX
Delivery & Support Lead
Responsibilities:
- Involved in the entire SDLC process that includes design, implementation, testing (Unit, Integration & UAT), deployment and maintenance.
- Responsible for gathering the business requirements from the end user and analyzed it to provide business intelligence solution.
- Categorize Dimension & Fact tables by interviewing Oracle functional experts & business analysts.
- Extensively used both star schema and snowflake schema methodologies in building and designing the logical data model, with both SCD1 and SCD2 dimensions.
- Designed the conceptual, logical and physical model according to the requirements using ERwin.
- Developed Source to Target Matrix which contains the transformation logic.
- Performed extensive data profiling on data to be brought into the Operational Data Store (ODS) from the invoice-processing systems.
- Developed Informatica mappings, mapplets, reusable transformations, tasks, sessions, and workflows for weekly and monthly processes to load heterogeneous data into the data warehouse; source files included VSAM files, delimited flat files, and Oracle tables.
- Implemented obfuscation techniques such as data masking and encryption, as the project dealt with sensitive and secured financial information.
- Built complex queries, PL/SQL stored procedures, and Oracle packages for use in Informatica mappings.
- Leveraged the Teradata BTEQ command utility to query the data mart tables.
- Integrated Informatica workflows with the mainframe-based ESP scheduler using UNIX shell scripts.
- In Cognos 8.1, developed Complex Reports like List, Cross tab, Drill-Through and various Charts.
- Managed user groups to access the Cognos reports through Cognos Connection.
- Used BMC Remedy to meet the designated SLAs on incidents and maintain optimum service delivery.
- Resolved problems by identifying root causes (RCA) for major incidents.
- Coordinated with the offshore team to ensure 24x7 support coverage.
Technologies: ERwin, Informatica 8.6, IDQ, Cognos 8.1, SAP MM, Oracle Apps R12 with Oracle 10g, Teradata V2R6, BTEQ, DB2, Toad, UNIX AIX, ESP, BMC Remedy.
Confidential, Irving TX
BI Developer
Responsibilities:
- Requirement Analysis, Preparing High Level and Low Level Design document.
- Built reporting models in MS PowerPoint.
- Developed intricate reports with SQL Server Reporting Services (SSRS).
- Built drill-down reports on MDR hierarchy cubes.
- Prepared complex SQL queries, stored procedures, functions, and views in MS SQL Server 2005 (see the stored-procedure sketch after this role).
- Worked extensively on the installation and configuration of SharePoint Portal 2007; documented the installation and configuration steps for the SPP 2007 server.
- Embedded MOSS 2007 Excel Services with the VZ.AI tool (using an iframe).
- Handled user management in SPP, creating group-level user permissions to view reports by business line.
- Created user manuals on reports usage.
Technologies: SQL Server 2005, SSRS, SSAS, SSIS, Visual SourceSafe, SharePoint Portal 2007, MOSS 2007.
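A minimal sketch of the kind of parameterized stored procedure that could feed an SSRS dataset in this role; the dbo.Orders table, its columns, and the parameters are hypothetical names used only for illustration.

```sql
-- Hypothetical stored procedure backing an SSRS report dataset (SQL Server 2005 era).
CREATE PROCEDURE dbo.usp_GetOrderSummary
    @StartDate    DATETIME,
    @EndDate      DATETIME,
    @BusinessLine VARCHAR(50) = NULL   -- optional filter; NULL returns all lines
AS
BEGIN
    SET NOCOUNT ON;

    SELECT  o.BusinessLine,
            COUNT(*)           AS OrderCount,
            SUM(o.OrderAmount) AS TotalOrderAmount
    FROM    dbo.Orders AS o
    WHERE   o.OrderDate >= @StartDate
      AND   o.OrderDate <  @EndDate
      AND  (@BusinessLine IS NULL OR o.BusinessLine = @BusinessLine)
    GROUP BY o.BusinessLine
    ORDER BY o.BusinessLine;
END;
```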
Confidential, Raleigh NC
BI Consultant and Onsite Coordinator
Responsibilities:
- Providing complete end to end BI solution.
- Requirement Analysis, Preparing High Level and Low Level Design document and Data Validation and Test Plans.
- Identified Multiple Dimensions and Fact tables based on Booking, Billing, Backlog and Shipping subject areas.
- Used advanced data modeling concepts such as degenerate dimensions, sub-dimensions, factless fact tables, and aggregate fact tables in the multidimensional model.
- Worked with data modelers in preparing logical and physical data models and adding/deleting necessary fields using ERwin.
- Created ERwin reports in HTML and RTF formats and made them available online for business managers and BI architects.
- Developing the mappings and workflows with Informatica Power center for Data warehouse build.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy and Sequence generator, Stored Procedure.
- Analyzed data quality issues and generated data quality reports using Informatica Data Quality (IDQ).
- Used the pmcmd command-line program to run Informatica jobs from the command line.
- Embedded these commands in shell scripts to create, schedule, and control workflows, tasks, and sessions.
- Created master-detail, drill-through, and other complex reports in Cognos Report Studio, rendered in Excel and PDF formats.
- Carried out Bursting of reports in Cognos Report Studio.
- Configured/handled Cognos Connection to Manage users & Reports.
- Actively involved in Unit Testing and Performance Tuning of Informatica mappings.
- Documenting mappings of source system to target system data structure in the data warehouse.
- Writing Application Usage Guidance Document for Business users.
Technologies: Informatica 7.x, Cognos 8.1, Oracle 9i and UNIX.
Confidential, San Antonio TX
Data Warehouse Module Lead
Responsibilities:
- Involved in design and source-to-target mapping preparation by understanding the legacy systems.
- Participated in identifying data mart tables with Sr. Data Modelers.
- Documented the data mart design information with ERwin.
- Developed ETL mappings with Informatica to build Data marts.
- Created transformations in Informatica like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations
- Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.
- Extensively used Informatica for loading the historical data from various tables for different departments.
- Scheduled the Informatica workflows with Control-M and created the ETL run book.
- Conducted code walkthrough sessions for the developed mappings.
Technologies: Informatica 7.x, IDQ, DB2, ERwin, Control-M and UNIX Shell Script.
Confidential, San Antonio TX
Data Warehouse Developer
Responsibilities:
- Involved in design and source-to-target mapping preparation by understanding the legacy systems.
- Developed ETL mappings with Informatica to build Data Warehouse.
- Tested and debugged the mappings with the Informatica Debugger and performed data validation testing.
- Scheduled the Informatica workflows with Control-M.
- Created MS Excel VBA Reports over reconciliation data, to validate source records against target records.
- Responsible for identifying the missed records in different stages from source to target and resolving the issue.
- Conducted data reconciliation walkthroughs with FIRS system experts.
- Performed unit testing and involved in UAT testing conducted by FIRS system experts.
Technologies: Informatica PowerCenter 7.x, Oracle, DB2, UNIX Shell Script, Excel VBA Reports.
Confidential, San Antonio TX
Data Warehouse Developer
Responsibilities:
- Gathered requirements for the Property System from the project and business community.
- Interviewed and conducted meetings with business managers and business experts who were using the legacy property applications.
- Participated in designing the Staging Data Source (SDS) and data marts for the Policy and Claims lines of business.
- Prepared software requirement specifications, business rules interacting with business units and designed Star schema, logical and physical database design for Domains.
- Designed ER diagrams, logical model (relationship, cardinality, attributes, and, candidate keys) and physical database (capacity planning, object creation and aggregation strategies) as per business requirements.
- Implemented Extraction of data from multiple operational systems (like policy, claims systems).
- Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Operational data store (ODS) and Data marts.
- As part of loading dimension tables created SCD Type1 and SCD Type2 mappings.
- Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email and command.
- Integrated data sources (such as claims data with policy data); comprehensively used Informatica to load historical data from various tables for different departments.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations.
- Extracted data from IMS VSAM files, Flat Files and populated into SDS.
- Responsibilities include creating the workflows & sessions and scheduling the workflows with Control-M.
Technologies: Informatica Power Center 7.x, IMS, Oracle, DB2, MS Visio and UNIX Shell Script.
Confidential, Honolulu HI
DW Application Developer
Responsibilities:
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Analyzing the different source systems and documenting the same.
- Participated in data modeling (logical and physical) for designing the Data Marts.
- Preparing Analysis and Design Documents for ETL and Shell Scripts.
- Developed standards and procedures for transformation of data as it moves from source systems to the data mart.
- Developing the code, including Informatica mappings, Control-M scripts for scheduling the jobs, and shell scripts.
- Building business rules with PL/SQL procedures and functions to load data.
- Created and Documented ETL Test Plans, Test Cases, Test Scripts, Expected Results, Assumptions and Validations. Conduct quality reviews of code.
- Analysis of requirements for finding any ambiguity, incompleteness or incorrectness.
- Client communication.
- Data size estimation.
- Managing and coordinating the offshore team.
- Involved in resolving post production data issues.
- Coordinating the team to avoid slippage in deliverables.
- Mentoring new resources on Informatica.
Technologies: Informatica 7.x, UNIX Shell Scripting, SQL Server 2005.
Confidential, Richardson, TX
ETL Developer
Responsibilities:
- Monitored the daily data warehouse load, which is fed from 65 sources through Informatica workflows.
- Resolved Crystal Report related issues.
- Resolved application-related issues as part of support.
- Resolved database-related issues and provided 24x7 support.
- Worked on work orders comprising code changes and enhancements.
Technologies: Informatica 7.x, Crystal Reports, Oracle 8.1, SQL Server 2000, TOAD, SQL Server Management Studio.
Confidential
ETL Developer
Responsibilities:
- Analyzing the business requirements.
- Coordinating with the source-system SCR on data-related issues, defining the process for incoming data file properties, naming conventions, frequency, etc.
- Building Informatica mappings to extract data from source systems to various layers of KDW.
- Created key Crystal reports to view data from the KDW.
- Coordinating the code migration activities from Dev to QA and QA to Prod.
- Coordinating the testing activities like unit testing and integration testing.
- Testing the consistency and quality of the data loaded into the warehouse.
- Involved in resolving post production data issues.
Technologies: Informatica 6.x, Crystal Reports, DB2, UNIX