
Development Lead & Production Support Resume

San Jose, CA

SUMMARY

  • Around seven years of experience in all phases of the data warehouse life cycle, spanning design, development, analysis, and testing of data warehouses using ETL, data modeling, Online Analytical Processing, and reporting tools
  • Mastery of the ETL process, with many years of experience extracting data from source systems ranging from mainframe databases such as DB2 and Teradata to RDBMSs such as Oracle and SQL Server, transforming it using a wide variety of transformations, loading the transformed data into the data warehouse, and analyzing the loaded data with OLAP tools
  • Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake modeling, fact and dimension tables, pivot tables, and modeling of data at all three levels: view, logical, and physical
  • Responsible for all activities related to the development, implementation, administration, and support of ETL processes for large-scale data warehouses; developed ETL procedures to load high-volume warehouses using tools such as PowerCenter and PowerMart
  • Strong working experience with various Online Analytical Processing (OLAP) tools such as Cognos Impromptu, Cognos PowerPlay, and Business Objects
  • Used various reporting tools to query information from the data marts and format it with advanced report-formatting techniques such as data visualization, navigation, and analysis
  • Over four years of experience in SQL and PL/SQL, including development of procedures, functions, stored procedures, triggers, and packages, index design and tuning, and database security design and implementation, plus more than two years of database design experience with Oracle 7.x/8.x/8i
  • Strong experience in Unix environments, including the vi editor and shell scripting

TECHNICAL SKILLS:

ETL Tools: Informatica 4.7/5.1/6.0/6.1/6.2/7.1/7.2 (Source Analyzer, Warehouse designer, Mapping Designer, Transformation Developer, Mapplet Designer, Workflow Manager/Server Manager, Workflow Monitor), SQL*Loader, Informatica PowerCenter/PowerMart/PowerConnect/PowerAnalyzer.

OLAP & Reporting Tools: Siebel Analytics 7.8, Cognos Impromptu & PowerPlay, Business Objects, Crystal Reports, Oracle Reports, Brio 6.1, Hyperion

Data/Object Modeling Tools: Oracle Designer, Erwin 4.0, Rational Rose

RDBMS: Oracle 7.x/8.x/8i/9i, SQL*Navigator, SYBASE, Teradata, DB2, MS Access 2000, SQL Server 2000

Operating Systems: MS Windows 98/NT/2000, Unix, Red Hat Linux 7.1

Languages: Java, C++, C, Pascal, Scheme, Prolog, Visual Basic, UNIX Shell Scripting, SQL, SQL*Plus, PL/SQL, Intel 8085/8086 Assembly

Internet Technologies: XML, HTML, DHTML, XSLT, UML, ASP, JavaScript, Tcl/Tk, Perl

Others: JUnit, Ant 1.3, JTest, Jester, JMutator 1.3, Visual Studio 6.0, CorelDraw 10, Adobe Photoshop 6.0, MS Office 2000, etc.

PROFESSIONAL EXPERIENCE

Confidential, San Jose, CA

Development Lead & Production Support

Responsibilities:

  • Worked with the Business analysts of the QDM data mart team in identifying the requirements for the data mart, SLA of the source system and the complete set of requirements to pull the data
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables, identifying and enforcing primary and foreign key relationships of the star schema.
  • Led the development process by guiding a team of three junior developers; developed mapping specification documents based on requirements from the business users and was solely responsible for the ETL development
  • Designed & Developed complex mappings, workflows and stored procedures to meet the business requirements
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Assisted the junior developers in debugging mappings, solving tough mapping bugs, guiding them with migrations and other coding related issues
  • Preparing mapping design documents to hand over to other developers, documentation of the coding structures and efficient practices
  • Set up the Informatica environment across DEV, TEST, PRE-PRODUCTION, and PRODUCTION; created folders for the various developers and managed security through appropriate access permissions
  • Scheduled jobs via crontab on Unix, writing shell scripts that use pmcmd commands to call the Informatica workflows and stored procedures based on a job dependency table
  • Responsible for monitoring all scheduled, running, and completed workflows and using an e-mail task to report their status (not just succeeded or failed, but also counts of failed source/target rows)
  • Tuning the Oracle PL/SQL code by adding indexes, partitioning tables, dropping indexes on huge tables before load and rebuilding later, identifying top 10 SQLs and tuning them on a regular basis
  • Tuning the Informatica mappings by tracking the performance of the workflows using .perf file, target load statistics, tracing level, replacing complex SQ joins by joiner transformations with ample cache
  • Created reports using Siebel Answers and modified the Siebel RPD for any RPD-related changes
  • Analysis of issues related to the Siebel dashboards by generating the query and checking the results on the backend and correcting the reports and RPD as needed
  • Provided complete production support: handled Informatica server outages and Siebel report issues, worked cases raised by business users, monitored ETL jobs, ensured the entire ETL process and support processes were properly documented, and handled security issues around other teams accessing our data by creating appropriate user security views
  • Handled onshore/offshore coordination with a team in India that supported the application overnight
  • Handling the alliance cases raised by business users for issues regarding Siebel dashboards and resolving them effectively by changing the back end code and RPD as necessary
  • Identifying code which can further be enhanced to improve the overall scheme of things and handling them as enhancements of the code by assigning appropriate development resources
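The crontab/pmcmd scheduling with a job dependency table described above can be sketched as a small shell script. The service, domain, folder, and workflow names, the status-file format, and all paths are illustrative assumptions, not the actual production setup:

```shell
#!/bin/sh
# Launch an Informatica workflow via pmcmd once its predecessor has
# finished, driven by a simple status file standing in for the job
# dependency table.  All names and paths below are illustrative.
STATUS_FILE=${STATUS_FILE:-/tmp/wf_status.txt}   # lines: "<workflow> <DONE|RUNNING>"

predecessor_done() {          # predecessor_done <workflow_name>
    grep -q "^$1 DONE$" "$STATUS_FILE"
}

start_workflow() {            # start_workflow <folder> <workflow_name>
    pmcmd startworkflow -sv INFA_SVC -d INFA_DOM \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$1" "$2"
}

main() {
    if predecessor_done wf_load_dims; then
        start_workflow QDM_DEV wf_load_facts
    else
        echo "wf_load_dims not finished; skipping wf_load_facts" >&2
        return 1
    fi
}

# Called from cron, e.g.:  0 2 * * * /opt/etl/run_facts.sh --run
if [ "${1:-}" = "--run" ]; then main; fi
```

In the real setup the dependency check would query the job dependency table (e.g., via sqlplus) rather than a flat file; the file keeps the sketch self-contained.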

Environment: Informatica PowerCenter/PowerMart 7.1/7.2, Siebel Analytics, Oracle 10g, Dollar Universe scheduling tool, Unix (IBM - AIX), and Windows XP.

Confidential, San Francisco, CA

Sr. Data Warehouse Developer

Responsibilities:

  • Worked with the SPA securities & lending staff and business analysts on requirements gathering, business analysis, testing, metrics, and project coordination
  • Solely responsible for the entire ETL process and the design of the mapping structures; worked along with two other junior developers and coordinated between the team lead, the business users, and the team
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Designed and developed complex mappings, workflows, and stored procedures to meet the business requirements
  • Coordinated huge data loads by dropping and recreating indexes, partitioning mapping pipelines, and using flat files and external loaders instead of writing to relational targets
  • Preparing mapping design documents to hand over to other developers, documentation of the coding structures and efficient practices
  • Converted certain mapping logic into stored procedures and shell scripts, calling them through the Stored Procedure transformation or a Command task, respectively
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships.
  • Scheduling the workflows using a status table with a record for each workflow, a job dependency table that holds the dependencies between workflows
  • Identifying the arrival of the nightly flat files using the .END file and kicking off the respective workflows making sure the overall load on the server is balanced
  • Tracking the performance of the workflows using .perf file, target load statistics, tracing level, replacing complex SQ joins by joiner transformations with ample cache
  • Responsible for monitoring all scheduled, running, and completed workflows and using an e-mail task to report their status (not just succeeded or failed, but also counts of failed source/target rows)
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Created repository, users, groups and their privileges using Informatica Repository Manager
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Connection management and scheduling of jobs to be run in the batch process
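The .END-sentinel detection described in the bullets above might look like the following sketch; the feed path, polling interval, retry budget, and workflow names are assumptions for illustration:

```shell
#!/bin/sh
# Poll for a nightly feed's .END sentinel file, then kick off the
# matching workflow.  Paths and names are illustrative.
wait_for_feed() {             # wait_for_feed <sentinel_path> <max_tries>
    tries=0
    while [ ! -f "$1" ]; do
        tries=$((tries + 1))
        if [ "$tries" -ge "$2" ]; then
            return 1          # feed never arrived within the budget
        fi
        sleep 60              # poll once a minute
    done
    return 0
}

main() {
    if wait_for_feed /data/inbound/positions.END 120; then
        pmcmd startworkflow -sv INFA_SVC -d INFA_DOM \
            -u "$INFA_USER" -p "$INFA_PASS" -f SPA_DEV wf_load_positions
    else
        echo "positions feed never arrived" >&2
        return 1
    fi
}
```

Balancing the overall server load, as the bullet notes, would then be a matter of staggering which workflows each detector script kicks off.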

Environment: Informatica PowerCenter/PowerMart 7.1/7.2, Business Objects, SYBASE ASE 12.5.2, SYBASE IQ, SYBASE Central, Interactive SQL, Embarcadero Rapid SQL, Unix (IBM - AIX), and Windows XP.

Confidential, San Francisco, CA

Sr. Data Warehouse Developer

Responsibilities:

  • Worked with the PCS financial services staff and business analysts on requirements gathering, business analysis, testing, metrics, and project coordination
  • Designed the Dimensional model of the Data warehouse and used Erwin 4.0 to design the business process, grain of the data representation, dimensions and fact tables with measured facts.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Created various data marts for both application areas, credit cards and financial, and used effective querying and formatting tools to present the data to the end users
  • Worked with huge tables that consisted of around 256 columns and 100 million records per each table.
  • Used complex data transformations with around 25 transformations for each mapping
  • Improved the performance of the mappings by moving the filter transformations early into the transformation pipeline, performing the filtering at Source Qualifier for relational databases and selecting the table with fewer rows as master table in joiner transformations.
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships.
  • Used task developer in the Workflow manager to define sessions
  • Created reusable worklets and mapplets and transformations.
  • Involved in writing shell scripts on Unix (IBM - AIX) for Informatica ETL tool to run the Sessions.
  • Created cubes, dimensions, hierarchies, mappings and complex mapplets using Informatica PowerCenter 7.1.
  • Responsible for monitoring all sessions that were scheduled, running, completed, or failed; involved in debugging the mappings that failed
  • Scheduling and coordination of the Informatica tasks is done using UNIX cron and Perl scripts. Email and paging facilities are also utilized at key points in the process.
  • Writing DB2 PL/SQL procedures for processing business logic in the database. Tuning of Database queries for better performance using tools like Command Center.
  • Developed different types of reports, such as master/detail, cross-tab, and chart (for trend analysis), using Cognos Impromptu 6.0 and Cognos PowerPlay 6.6
  • Used Impromptu features such as sub-reports, Drill through and charts.
  • Created application-specific reports such as reports by customer, by account, and by credit card
  • Documented processing times for each module, developed test cases and used them to run through each process. Tuned the matching parameters based on test results. Implemented and supported the Business Intelligence environment and User Interface.
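The e-mail and paging hooks at key points in the cron-driven process, mentioned above, could be sketched as below; the mailx invocation, the address, and the session names are illustrative assumptions:

```shell
#!/bin/sh
# Send an operator notification at a key point in the nightly load.
# The address and session names are illustrative.
notify() {                    # notify <subject> <body>
    echo "$2" | mailx -s "$1" etl-oncall@example.com
}

report_session() {            # report_session <session> <status> <failed_rows>
    if [ "$2" = "FAILED" ] || [ "$3" -gt 0 ]; then
        notify "ETL ALERT: $1 $2 ($3 failed rows)" "Check the session log for $1"
        return 1
    fi
    notify "ETL OK: $1" "$1 completed with no failed rows"
}
```

A paging gateway typically accepts mail the same way, so the same helper covers both e-mail and pager notification.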

Environment: Informatica PowerCenter/PowerMart 7.1/7.2, DB2, SQL/PLSQL, dynamic SQL, Brio/Hyperion, CGI, Perl, UNIX shell programming, Erwin 4.0, Unix (IBM - AIX), and Windows NT.

Confidential, San Jose, CA

Sr. Data Warehouse Developer

Responsibilities:

  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems into the Oracle target database.
  • Analyzed and understood all data in the source databases, participated in the knowledge transfer process and prepared the required documentation.
  • Monitored jobs on $U, oversaw the entire process, and handled failures in Informatica mappings
  • Wrote Oracle stored procedures for data migration and external loader scripts; created indexes to improve performance, with scripts to drop and recreate them around mapping loads
  • Wrote Unix shell scripts to manage workflow runs, calling pmcmd startworkflow and waitworkflow commands to loop through the workflows until an end point is reached
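The startworkflow/waitworkflow looping described above can be sketched as follows; the service, domain, folder, and workflow names are placeholders:

```shell
#!/bin/sh
# Run workflows in sequence: start each, block on waitworkflow,
# and stop the chain at the first failure.  Names are placeholders.
run_chain() {                 # run_chain <folder> <workflow...>
    folder=$1; shift
    for wf in "$@"; do
        pmcmd startworkflow -sv INFA_SVC -d INFA_DOM \
            -u "$INFA_USER" -p "$INFA_PASS" -f "$folder" "$wf" || return 1
        pmcmd waitworkflow -sv INFA_SVC -d INFA_DOM \
            -u "$INFA_USER" -p "$INFA_PASS" -f "$folder" "$wf" || {
            echo "$wf failed; aborting chain" >&2
            return 1
        }
    done
}

# e.g.: run_chain PRJ_DEV wf_stage wf_dims wf_facts
```

waitworkflow blocks until the named workflow completes and returns nonzero on failure, which is what lets the loop stop at the first broken link.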

Environment: Informatica PowerCenter/PowerMart 6.1, Business Objects, Oracle 9i, TOAD, SQL/PLSQL, Perl, UNIX shell programming, $U, Erwin 4.0, Unix (IBM - AIX), and Windows NT.

Confidential, Oakland, CA

Sr. Data Warehouse Developer

Responsibilities:

  • Participated in the entire requirements engineering process, from the requirements elicitation phase (gathering requirements from the retail business owners and marketing staff) through documentation of the requirements
  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in sales area.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems like Oracle, Teradata, Sybase into the Oracle target database.
  • Analyzed and understood all data in the source databases and designed the overall data architecture and all individual data marts in the data warehouse for each of the areas: Finance, Credit Cards, and Brokerage
  • Involved in the creation of Oracle tables, table partitions, and indexes
  • Implemented various integrity constraints for data integrity like Referential integrity using primary key and foreign keys relationships.
  • Handled alerting mechanisms, system utilization issues, performance statistics, capacity planning, integrity monitoring, population, maintenance, reorganization, security, and recovery of databases.
  • Worked in an onshore/offshore coordination setting, delegating to and managing a group at Accenture in India
  • Involved in quality assurance of data, automation of processes.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Identified and tracked the slowly changing dimensions/mini dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions.
  • Used Task developer in the Workflow manager to define sessions
  • Assisted retailers in understanding consumer buying patterns and in creating consumer sets for marketing campaigns
  • Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Created repository, users, groups and their privileges using Informatica Repository Manager
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Connection management and scheduling of jobs to be run in the batch process
  • Generated detailed reports from the data marts using Business Objects.
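A recurring pattern in the work above is dropping indexes before a bulk load and recreating them afterwards. A sketch of a helper that emits the SQL piped to sqlplus (the table and index names are invented for illustration):

```shell
#!/bin/sh
# Emit the SQL piped to sqlplus around a bulk load: drop the listed
# indexes first, recreate them afterwards.  Names are illustrative.
drop_sql() {                  # drop_sql <index...>
    for idx in "$@"; do
        echo "DROP INDEX $idx;"
    done
}

create_sql() {                # create_sql <table> <index>:<column> ...
    tbl=$1; shift
    for spec in "$@"; do
        idx=${spec%%:*}
        col=${spec#*:}
        echo "CREATE INDEX $idx ON $tbl ($col);"
    done
}

# Around a load, e.g.:
#   drop_sql fact_sales_dt_ix | sqlplus -s "$DB_CRED"
#   ... run the Informatica session ...
#   create_sql fact_sales fact_sales_dt_ix:sale_date | sqlplus -s "$DB_CRED"
```

Loading without the indexes in place avoids per-row index maintenance on a huge table; the one-time rebuild afterwards is usually far cheaper.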

Environment: Informatica PowerCenter/PowerMart 6.1/7.1, Cognos Impromptu 6.0, Cognos PowerPlay 6.6, Oracle 9i, DB2, SQL/PLSQL, dynamic SQL, CGI, Perl, UNIX shell programming, Erwin 4.0, Unix (IBM - AIX), and Windows NT.

Confidential, Chevy Chase, MD

Sr. Data Warehouse developer

Responsibilities:

  • Participated in the entire requirements engineering process, from the requirements elicitation phase (gathering requirements from the retail business owners and marketing staff) through documentation of the requirements
  • Managed the entire ETL process involving the access, manipulation, analysis, interpretation and presentation of information from both internal and secondary data sources to customers in sales area.
  • Developed complex mappings using Informatica PowerCenter Designer to transform and load the data from various source systems like Oracle, Teradata, Sybase into the Oracle target database.
  • Analyzed and understood all data in the source databases and designed the overall data architecture and all the individual data marts in the data warehouse for each of the areas Merchandising, CRM, Finance, and Store Operations.
  • Handled alerting mechanisms, system utilization issues, performance statistics, capacity planning, integrity monitoring, population, maintenance, reorganization, security, and recovery of databases.
  • Involved in quality assurance of data, automation of processes.
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes.
  • Created synonyms for copies of time dimensions, used the sequence generator transformation type to create sequences for generalized dimension keys, stored procedure transformation type for encoding and decoding functions and Lookup transformation to identify slowly changing dimensions.
  • Used Task developer in the Workflow manager to define sessions
  • Assisted retailers in understanding consumer buying patterns and in creating consumer sets for marketing campaigns
  • Created application-specific Data Marts so that users can access personalized dashboards of information that is specific to their department and business unit.
  • Responsible for monitoring all sessions that were running, scheduled, completed, or failed, and for debugging the mapping when a session failed
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Created repository, users, groups and their privileges using Informatica Repository Manager
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the Sessions.
  • Connection management and scheduling of jobs to be run in the batch process
  • Generated detailed reports from the data marts using Business Objects.

Environment: Informatica PowerMart/PowerCenter 6.1/6.2 (Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer), Oracle 8i, Sybase, T-SQL, SQL Server 2000, Teradata, DB2 V6.1, Erwin 4.0, Business Objects, HP-UX, and Windows NT.

Confidential, Chantilly, VA

Data Warehouse Developer

Responsibilities:

  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations to facilitate one-time, daily, monthly, and yearly loading of data
  • Helped the USPS migrate data related to customer evaluations of USPS service and product performance from the operational data sources to an enterprise data warehouse
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and the Target Data
  • Developed PL/SQL business functions and called them from the Cognos tools
  • Designed and developed PowerPrompts for web users for flexible querying
  • Extracted the Data items from different sources like Sybase, Oracle, DB2, and flat files and loaded them into Oracle financials target.
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) that maintains the data marts (loading data and analyzing it using OLAP tools)
  • Created application specific data marts so that the customer evaluation data in the Data Warehouse will then be available to assess current USPS capabilities to meet customer expectations, measure the success of USPS efforts to improve operations, and to reward strong performance.
  • Business intelligence reporting development to design, develop and implement operational reporting and OLAP analysis.
  • Extracted Erwin physical models into the repository manager using Informatica PowerPlug
  • Provided data modeling services for a large data warehouse design.
  • Involved in the development and testing of programs, Informatica mappings and update processes.
  • Designed and developed complex Aggregate, Join, Router, Look up and Update transformation rules (business rules).
  • Developed schedules to automate the update processes and Informatica Sessions/Batches
  • Involved in fixing repository related, Reader process and External procedure problems.
  • Loaded data into warehouse tables and into the staging area using SQL*Loader
  • Involved in quality assurance of data, automation of processes, programming for data manipulation and graphics, validation of programming, and presentation of results.
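The SQL*Loader staging loads above could use a generated control file along these lines; the table, column, and path names are assumptions for illustration:

```shell
#!/bin/sh
# Write a SQL*Loader control file for a delimited staging feed.
# Table, column, and path names are illustrative.
write_ctl() {                 # write_ctl <ctl_path> <data_file> <table> <columns>
    cat > "$1" <<EOF
LOAD DATA
INFILE '$2'
APPEND INTO TABLE $3
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
($4)
EOF
}

# Then invoke, e.g.:
#   write_ctl /tmp/stg.ctl /data/in/evals.dat stg_evaluations \
#       "eval_id, cust_id, score, eval_dt DATE 'YYYY-MM-DD'"
#   sqlldr userid="$DB_CRED" control=/tmp/stg.ctl log=/tmp/stg.log
```

Generating the control file per feed keeps one script serving many staging tables instead of a hand-edited control file for each.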

Environment: Informatica PowerCenter/PowerMart 5.1, Cognos PowerPlay & Impromptu 6.6, Crystal Reports, Sybase, Oracle, DB2, SQL Server 7.0, SQL, PL/SQL, Erwin, Sun Solaris, and Windows NT.

Confidential

Software Consultant

Responsibilities:

  • Developed commercial banking software using Borland C as the development language on Windows 95 operating system.
  • Used Java's unit-testing tool JUnit to generate unit tests and run them against the Java front end
  • Performed functional and regression testing of the Java GUI using WinRunner
  • Added visual recorders in the form of Database Checkpoints used to validate the selected contents of the ODBC compliant bank database within a test case.
  • Tested a large number of source files together by compiling, running, and collecting reports via modifications to the Ant build file
  • Collected and analyzed the reports using C/Perl scripts

Environment: Oracle, SQL*Plus, Windows 95, Borland C, JTest, WinRunner, JUnit 2.0, Ant 2.0.
