Team Lead/Senior Developer Resume Profile
Detroit, MI
PROFESSIONAL PROFILE
- Over 12 years of IT experience in managing projects, Data Warehousing and Business Intelligence concepts, design principles, and software architecture.
- Worked as Team/Technical Lead along with hands-on development experience.
- Experience in successfully managing programs with teams of more than 10 resources.
- Certified in Lean Project Delivery
- Certified in Informatica Designer Module.
- Responsible for interacting with business counterparts, vendors, and customers to identify information needs and business requirements.
- Mentored newly hired employees and contractors on business knowledge and data flow concepts.
- Expertise in OLTP/OLAP System Study, Analysis and E-R Data modeling, developing Database Schemas like Star Schema and Snowflake Schema used in Relational, Dimensional and Multi-Dimensional modeling.
- Experience in all phases of the SDLC (requirement gathering, analysis, planning and design, development, testing, implementation, maintenance, and enhancements) using the ETL tool Informatica PowerCenter.
- Experience in full life cycle development of Slowly Changing Dimensions (SCD) and Change Data Capture (CDC) for Data Marts.
- Extensive Knowledge in designing Functional Detailed Design Documents for data warehouse development.
- Experience in creating and maintaining Project Plans, Work Plans, Work Breakdown Structures (WBS), and Capacity Plans.
- Extensively worked on Informatica PowerCenter.
- Knowledge of Informatica PowerExchange for importing sources from external systems such as mainframe (IMS, DB2, and VSAM) and ERP.
- Strong Data Analysis and Data Profiling background using MS Access and Informatica Data Explorer (IDE).
- Experience with the Sales Performance Management tool TrueCOMP 5.1.2.3, working on the integration side.
- Experience in writing Stored Procedures and Functions (PL/SQL, T-SQL).
- Worked with different data sources, including relational databases (Oracle, SQL Server, Teradata, and DB2), files (flat files, XML files, VSAM), and mainframe systems.
- Reporting experience using the Cognos suite (Transformer, Impromptu, PowerPlay) and Seagate Crystal Reports; also familiar with Business Objects.
- Developed test plans and executed test cases.
- Skilled in UNIX scripting, with knowledge of Perl scripting.
- A motivated self-starter, with excellent communication, interpersonal and analytical skills.
TECHNICAL SKILLS:
ETL Tools | Informatica PowerCenter 9.5.1/8.6.1/8.x/7.x/6.x, Sales Performance Management tool TrueCOMP 5.1.2.3, Pentaho Data Integration (PDI) 4.3, Pentaho Business Analytics 3.9, DataStage 7.5, SSIS 2005, Informatica PowerExchange Navigator 9.5.1 |
Data Modeling | Visio, ERWin |
Databases | Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, Teradata v2R4/R6, DB2, MS Access |
Front End Utilities | TOAD, SQL Query Analyzer, Oracle SQL Developer 1.1.2, Teradata SQL Assistant 7.1.0, SQL*Plus |
Operating Systems | Windows 95/98/2000/NT/XP, AIX, HP-UX, SunOS, MS-DOS, MVS |
External Scheduling Tools | Tivoli Workload Scheduler (TWS) 8.4/8.3 |
Packages | Microsoft Office (Access, Excel, Project, Visio, Word), XMLSpy, UltraEdit |
Languages | C/C++, Java, JavaScript, HTML, XML, VB, COBOL, SQL, PL/SQL, T-SQL |
Reporting Tools | Cognos 10.2, Business Objects, Crystal Reports |
Other | SCM Versioning Tool, PVCS Versioning, ClearQuest (Change Request Management Tool), Mercury Quality Center 9.0 (TestDirector), Data Explorer (IDE), Data Quality (IDQ), Microsoft Project 2007 |
PROFESSIONAL EXPERIENCE
Confidential
Team Lead/Senior Developer
- Team Lead, leading a team of 10 resources
- Provided production support for the IM initiative along with hands-on development.
- Resolved production issues and provided quick resolutions.
- Improved the production support process to deliver resolutions more efficiently.
- Ensured that all jobs completed per SLA requirements.
- Performance-tuned long-running jobs.
- Worked as a Subject Matter Expert (SME) on most of the subject areas and downstream applications.
- Worked across the different project process groups: initiating, planning, executing, monitoring and controlling, and closure.
- Created the Conceptual Solution and worked on the Estimates and Capacity Planning.
- Developed Financial Forecasts / Budgets for various Projects.
- Managed Cross-Functional Technical resources according to the Project needs.
- Led multiple projects implemented through different vendor-company workforces (Accenture, Cognizant, Galaxe, Compuware).
- Led the DQ effort to resolve data quality issues.
- Worked on different phases of SDLC starting from Requirement Gathering to Post-Implementation.
- Created design documents (high-level design, technical design, and STM specs) and provided them to development teams under different programs (National Health Reform, Exchange, RMRA, PPQ, Book of Business).
- Created the detail work plan and project work plan to coordinate concurrent development across different initiatives.
- Identified risks and issues in early phases through daily Scrum meetings.
- Provided Weekly Project Status reports to Leadership.
- Led a matrixed project team to define, establish, and ensure completion of all program/project milestones while adhering to timelines and budgets.
- Worked on building different Data Marts in a de-normalized fashion to support outbound feeds and Cognos Framework reports.
- Performed code inspections/reviews of the code developed by the ETL developers to make sure standards and best practices were followed.
- Managed programs and projects to completion on time and on budget (OTOB).
- Led the activities, growth, and professional development of three staff members.
- Completed the BCBS Industrial Certifications.
- Automated existing manual processes.
- Analyzed the existing systems and performed a feasibility study.
- Covered the full project life cycle, from analysis to production implementation, with emphasis on identifying sources, validating source data, developing logic and transformations per requirements, creating mappings, and loading the data into different targets.
- Designed and developed Informatica mappings/mapplets to extract, transform, cleanse, and load data into target tables from various source systems such as mainframe, flat files (variable and fixed length), VSAM, IMS, DB2 (mainframe), and SQL Server.
- Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
- Supported Informatica administration activities, including creating and managing users, creating repositories, and migrating code from one environment to another.
- Used PowerExchange 9.1 to connect, extract and transfer mainframe files.
- Responsible for the daily, weekly, monthly, and yearly loads and for handling reject data.
- Extensively worked on performance tuning of programs, ETL procedures, and processes, and used the Informatica Debugger to validate mappings and gather troubleshooting information about data and error conditions.
- Involved in fixing invalid mappings, testing stored procedures and functions, and unit and integration testing of Informatica sessions, batches, and the target data.
- Resolved performance issues by doing test runs and incorporating tuning measures like joining tables in database using pre-session SQL rather than Joiner, keeping the Filter as close as possible to the source, using unconnected lookup, optimized SQL overrides, increasing target commit intervals etc.
- Designed and developed UNIX shell scripts as part of the ETL process to compare control totals and to automate loading, pulling, and pushing data to and from different servers.
- Used Tivoli to schedule the workflow process
- Performed Unit testing, string testing, regression testing, smoke testing and system integration testing.
- Created test cases and tested the transformations/jobs.
- Developed Slowly Changing Dimensions Type I, II, and III (a minimal SQL sketch of the Type 2 pattern follows this list).
- Maintained naming standards and warehouse standards for future application development.
- Incorporated the Email feature to send Reports to the clients on job completion.
- Performed data quality analysis.
- Scheduled and monitored job batches.
- Created documentation on the design, development, implementation, daily loads and process flow of the jobs.
- Translated high level design documents to low level ETL design documents.
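The SCD Type 2 work referenced above was implemented with Informatica transformations (Lookup, Update Strategy, Sequence Generator); the Oracle-flavored SQL below is only a minimal sketch of that pattern, using hypothetical customer_stg/customer_dim table and column names rather than actual project code.

```sql
-- Minimal SCD Type 2 sketch (hypothetical staging and dimension tables).
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE customer_dim d
   SET d.effective_end_dt = CURRENT_DATE,
       d.current_flag     = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND (s.customer_name <> d.customer_name
                       OR s.customer_addr <> d.customer_addr));

-- Step 2: insert a new current row for changed and brand-new customers.
INSERT INTO customer_dim
       (customer_key, customer_id, customer_name, customer_addr,
        effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.customer_addr,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```

Running the expire step before the insert step means changed customers no longer have a current row, so the second statement picks up both changed and new keys while leaving unchanged rows untouched.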
Environment:
Informatica PowerCenter 9.1.1/8.6.1/8.1.1/7.1.3 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Informatica PowerExchange Navigator 8.1.1, Oracle 10g/9i, MS SQL Server 2005, PL/SQL, DB2, SQL Developer 1.1.2, TOAD 8.6.1, PuTTY, Shell Scripting, UNIX, Windows XP, ClearQuest, MS Access, Cognos 8.3, Tivoli Workload Scheduler (TWS) 8.4/8.3, SCM Versioning Tool, Data Explorer (IDE) 5.0, SQL Enterprise Manager 8.0, SQL Query Analyzer 8.0, PVCS
Confidential
TrueCOMP Integration Developer
- Coordinated with the Callidus team on the integration side of the design and created design-mapping documents.
- Wrote scripts to automate the batch processes executed on a daily/monthly/yearly basis.
- Supported the integration during SIT/UAT and supported the testing team in the creation of test scripts.
- Provided support to business users on TrueCOMP Manager and resolved queries related to any anomalies found.
- Responsible for code migration, maintenance, and version control.
- Provided production support for execution of the batches and was the primary contact for resolving any post-production issues.
- Worked on the interface, developing Informatica mappings to load data from source systems into stage/ODS tables.
- Provided training to team members on TrueCOMP architecture and support.
Environment: Oracle 9i, TOAD 7.4.0.3, Informatica PowerCenter 7.1.1/8.1.1, PL/SQL, TrueCOMP 5.1.2.3, Mercury Quality Center 9.0 (TestDirector), Shell Scripting.
Confidential
Data Warehouse Analyst/Developer
- Designed, constructed, tested, and implemented Informatica extracts, interfaces, and the appropriate Informatica PowerCenter objects to address the interface requirements.
- Designed the data architecture to ensure that all information necessary to fulfill interface requirements was built into the Informatica staging area, and that all information necessary to support integrated reporting requirements was built into a report staging area to facilitate the Business Objects Universe design.
- Supported User Acceptance Testing by assisting with the development of user acceptance test plans, tracked against a requirements traceability matrix.
- Participated in business requirements gathering and business analysis by attending business meetings and discussing the issues to be resolved; translated user inputs into ETL design documents.
- Transformed business requirements into system/data requirements by preparing functional and technical specification documents.
- Extensively used ERWin for Conceptual, Logical and Physical data modeling and designed Star Schemas
- Performed source-to-target mapping by developing complex mappings in Informatica to load data extracted from various sources, using transformations such as Union, Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.
- Imported data from mainframe systems using PowerExchange.
- Used Debugger to test the mappings and fixed the bugs.
- Developed new and modified existing Informatica mappings and workflows based on specification.
- Created and configured Workflows and Sessions to load data from Staging to Target database tables using Informatica Workflow Manager.
- Worked with SCD tables such as Type 1, Type 2.
- Resolved data quality issues.
- Used Tivoli for scheduling of data extraction, transformation and loading jobs related to different systems.
- Performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
- Worked on database performance tuning: dropping and rebuilding indexes and collecting statistics on tables.
- Worked with Informatica version control for team-based development to check in and check out mappings, objects, sources, targets, workflows, etc.
- Loaded data into Teradata target tables using Teradata utilities (FastLoad, MultiLoad, and FastExport); queried the target database using Teradata SQL and BTEQ for validation (see the reconciliation sketch after this list).
- Performed Unit Testing, Integration Testing and User Acceptance Testing to pro-actively identify data discrepancies and inaccuracies.
- Followed the SDLC methodology and advocated for an Agile approach.
- Performed Gap and Impact Analysis for the existing OLTP and DWH instances.
- Converted the stored procedures into ETL jobs using the BI/ETL tool Pentaho Data Integration (PDI).
- Extracted data from different source systems such as Oracle and SQL Server and integrated it into the OLAP DWH, which is treated as the single source of truth.
- Applied various steps and transformations while working with the PDI tool.
- Cleansed the data and applied business rules with PDI tool.
- Normalized and de-normalized the data making use of Pentaho steps.
- Executed the Pentaho jobs on both Client and Server end.
- Generated reports and created dashboards with Pentaho Business Analytics.
- Performed Unit Testing and created the Unit Test case documents.
- Migrated the code from Dev to QA and to PROD repositories by exporting as Pentaho XML files.
- Automated the jobs using third party tool Tidal.
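The Teradata load validation mentioned above was performed through BTEQ; the queries below are only an illustrative sketch of that kind of post-load reconciliation, with hypothetical stg_db.claim_stg and dw_db.claim_fact objects rather than the actual project tables.

```sql
-- Hypothetical post-load reconciliation, run via BTEQ after a FastLoad/MultiLoad step:
-- compare row counts and an amount control total between staging and target.
SELECT 'STG' AS src, COUNT(*) AS row_cnt, SUM(claim_amt) AS amt_total
  FROM stg_db.claim_stg
UNION ALL
SELECT 'TGT', COUNT(*), SUM(claim_amt)
  FROM dw_db.claim_fact
 WHERE load_dt = CURRENT_DATE;

-- Rows present in staging but missing from the target (should return zero rows).
SELECT s.claim_id
  FROM stg_db.claim_stg s
  LEFT JOIN dw_db.claim_fact f
    ON f.claim_id = s.claim_id
 WHERE f.claim_id IS NULL;
```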
Environment:
Informatica PowerCenter 6.1, Tivoli, AIX, Mainframe, Oracle, Teradata V2R4, DB2, SQL, PL/SQL, TOAD, Pentaho Data Integration (PDI) 3.1, SQL Server, UNIX Shell Scripts.