
Sr. Data Warehouse Developer Resume


San Francisco, CA

SUMMARY

  • Over 9 years of experience in the analysis, design, and development of business applications across multiple platforms in data warehousing, using Informatica 9.x/8.x/7.1/6.2, Oracle, Sybase, Teradata, and SQL Server.
  • Extensive knowledge of various data sources such as Oracle, Teradata, MS SQL Server, flat files, and HL7 files.
  • Expertise in Data Warehousing, Data Migration, and Data Integration using Business Intelligence (BI) tools such as Informatica Power Center, Power Exchange CDC, B2B Data Transformation, Informatica Data Quality, Informatica Data Integration, MDM, SSIS, OBIEE, and Cognos.
  • Extensive knowledge of relational and dimensional data modeling, star and snowflake schemas, fact and dimension tables, and process mapping using top-down and bottom-up approaches.
  • Extensively worked on Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
  • Coded several UNIX shell scripts for auditing while developing tools and system applications.
  • Good exposure to Informatica MDM, where data cleansing, de-duplication, and address correction were performed.
  • Experience working with the Informatica Data Validation Option (DVO) tool to validate archived jobs.
  • Hands-on experience building mappings with varied transformation logic, including Unconnected and Connected Lookup, Router, Aggregator, SQL, Joiner, Update Strategy, Java, and reusable transformations.
  • Created ETL mappings using Informatica Power Center to move data from multiple sources, such as flat files and Oracle, into common target areas such as data marts and the data warehouse.
  • Good experience in data modeling concepts: star schema and snowflake schema.
  • Extracted data from multiple operational sources to load the staging area, data warehouse, and data marts using CDC and SCD (Type 1/Type 2/Type 3) loads; a brief SCD Type 2 sketch follows this list.
  • Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts, staging tables, and the semantic layer.
  • Extensive knowledge of data quality using Informatica IDQ (Developer and Analyst tools).
  • Experience in PL/SQL programming (stored procedures, triggers, packages) using Oracle, and good knowledge of Sybase.
  • Developed strong professional skills by working both independently and as a team member to analyze functional/business requirements and prepare test plans and test scripts. Collaborated with onsite teams and interacted with and managed various offshore teams.
  • Clear understanding of Business Intelligence and Data Warehousing concepts, with emphasis on ETL and the SDLC, quality analysis, change management, compliance, and disaster recovery.
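
As a brief illustration of the SCD Type 2 pattern referenced above, the following is a minimal Oracle SQL sketch. The DIM_CUSTOMER and STG_CUSTOMER tables, their columns, and the sequence are hypothetical placeholders, not taken from any actual engagement:

    -- Close out the current version of any customer whose tracked attributes changed
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name
                           OR s.address <> d.address));

    -- Insert a fresh current row for changed customers and for brand-new customers
    INSERT INTO dim_customer
          (customer_key, customer_id, customer_name, address,
           eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name, s.address,
           SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

In Informatica Power Center this pattern is typically built with a Lookup on the dimension plus an Update Strategy transformation rather than hand-written SQL; the SQL form is shown only to make the logic explicit.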

TECHNICAL SKILLS

ETL/Data-Modeling Tools: Informatica Power Center 9.6.1/9.5.1/9.1/8.6.1/8.1/7.1/6.x (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), SQL*Loader, SSRS, B2B (DT/DX), IDQ 9.x (Developer and Analyst tools)

RDBMS: Oracle 11g/10g/9i, DB2, Teradata, SQL Server, SQL, PL/SQL

Tools: DB Visualizer, SQL Assistant, SQL*Plus, TOAD, SQL*Loader, TSA

Engagement Experience: Metadata Development/Customization, Applications Development, Application / Integration Testing, Report Development, ETL Development, Query Performance Tuning, Application Debugging.

Operating Systems: Windows XP/2008/2007/2003/NT/98/95, UNIX, LINUX.

Environment: HTML, XML, SQL, PL/SQL, MYSQL, JavaScript

PROFESSIONAL EXPERIENCE

Confidential, San Francisco, CA

Sr. Data Warehouse Developer

Responsibilities:

  • As a senior developer, participated in meetings with BAs to derive the transformation rules to be applied to the identified sources.
  • Configured Address Doctor, which can cleanse worldwide address data, and enhanced it with modifications made during installation.
  • Planned, created, and executed workflows to integrate data from varied sources such as Oracle, DB2, flat files, and SQL databases, and loaded it into the landing tables of the Informatica MDM Hub.
  • Responsible for extraction, transformation, and loading via Informatica 9.6.1 for RDBMS sources and Informatica Power Exchange for mainframe files.
  • Implemented various loads for premium and loss data, including daily, weekly, and quarterly loads, using an incremental loading strategy; a sketch of this pattern follows this list.
  • Used Informatica Version Control to check in all versions of the objects used in creating mappings and workflows, keeping track of changes in the development, test, and production environments.
  • Extensively worked with Slowly Changing Dimensions Type 1 and Type 2 for data loads, used Source Qualifiers with left outer and inner joins, and applied pre-SQL and post-SQL in Informatica sessions where required.
  • Applied all required field-level transformations when moving data to the persistent stage, and used Informatica Pushdown Optimization (PDO) for faster loading.
  • Resolved inconsistent and duplicate data to support strategic goals with multidomain MDM.
  • Wrote several shell scripts for auditing databases, wrote high-level shell scripts for 340-byte files and deployed them between environments, and served on the UNIX standards committee for code walkthroughs of other developers' programs.
  • Participated in daily scrum meetings with the team on design review, code review, and test review, and helped with each of them to keep the agile project flowing.
  • Worked with different caches, such as the index cache, data cache, lookup cache (static, dynamic, and persistent), and joiner cache, while developing mappings.
  • Created parameter files in the UNIX environment and deployed them from development to stage to production; created and used parameters and variables in workflows; and created and deployed Informatica connections for workflows in all environments.
  • Created rules with the Data Validation Option (DVO) tool to read metadata repositories, ensure data matched the validation option, and identify data inconsistencies.
  • Imported and exported test metadata such as table pairs, lookup views, and join views using DVO, and ran reports to display test definitions and results.
  • Invoked DVO at the command line to schedule test execution, embedding specific tests as part of a workflow, and created a process that moves data to the target and then runs validation via DVO.
  • Tested table pairs with tests such as Count, Count Distinct, Count Rows, Min, Max, Avg, and Sum, generating a variety of reports summarizing testing activities, including table-pair summaries and detailed test results.
  • Worked with the offshore QA team, sharing resources and transferring knowledge on how to validate results; resolved defects per project standards through unit and UAT testing; and carried out end-to-end testing, supporting the UAT effort with prompt requirement-document changes, fixes, and resolutions for all changes and defects.
  • Built a delivery layer on top of the functional layer, essentially a replica of the functional layer's output for the business. Participated in UAT meetings with the business and reconciled the developed product against business requirements.
  • Used Jira for agile project management (creating tasks, updating work efforts, and viewing burndown charts) and the Jazz data management tool to raise and resolve defect tickets.
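
A minimal SQL sketch of the high-water-mark incremental loading strategy mentioned above. PREMIUM_STG, PREMIUM_FACT, and ETL_BATCH_CONTROL are hypothetical tables used for illustration only:

    -- Pull only the rows changed since the last successful run
    INSERT INTO premium_fact (policy_id, premium_amt, load_date)
    SELECT s.policy_id, s.premium_amt, SYSDATE
      FROM premium_stg s
     WHERE s.last_update_ts > (SELECT last_load_ts
                                 FROM etl_batch_control
                                WHERE job_name = 'PREMIUM_DAILY');

    -- Advance the high-water mark only after the load succeeds
    UPDATE etl_batch_control
       SET last_load_ts = SYSTIMESTAMP
     WHERE job_name = 'PREMIUM_DAILY';

The same idea drives the daily, weekly, and quarterly loads; only the filter window and schedule differ.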

Environment: Informatica Power Center 9.6.1, MDM, Mainframe, IBM DB2, DT/DX, SQL Server, PL/SQL, DB Visualizer, SQL Assistant, TOAD, Data Validation Option (DVO), Autosys, UNIX/LINUX.

Confidential, Dallas, TX

Sr. Data Warehouse Developer

Responsibilities:

  • Worked with business analysts on requirement gathering and business analysis, and translated the business requirements into technical specifications to build the enterprise data warehouse.
  • Participated in knowledge transfers from dependent teams to understand the business activities and application programs, and documented the findings for internal team reference.
  • Designed database table structures for transactional and reference data sources.
  • Filtered XML claims files using filter conditions on the D9 segment and converted the filtered claims XML files back to EDI format using a serializer in B2B Data Transformation.
  • Performed source data analysis and data profiling for data warehouse projects.
  • Set up the data and configured the components needed by Hierarchy Manager for the MDM Hub implementation, including hierarchies, relationship types, packages, and profiles, using the Hierarchies tool in the Model workbench.
  • Designed and implemented all stages of the data life cycle; maintained coding and naming standards.
  • Involved in data analysis for source and target systems, with a good understanding of data warehousing concepts: staging tables, dimensions, facts, and star and snowflake schemas.
  • Initiated data modeling sessions to design and build or extend data mart models supporting the reporting needs of applications.
  • Designed and developed data models using techniques such as star schemas and dimensional modeling.
  • Extracted data from heterogeneous sources spanning multiple databases, such as Oracle, SQL Server, and fixed-width and delimited flat files, and transformed it into a harmonized data store under stringent business deadlines.
  • Used features like email notifications, scripts and variables for ETL process using Informatica Power Center.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Extracted data from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica Power Center.
  • Extensively worked on data cleansing, data analysis, and monitoring using Informatica Data Quality.
  • Wrote shell scripts for the parameter files needed in each environment.
  • Extensively used transformations such as Source Qualifier, Expression, Filter, Aggregator, Joiner, Lookup, Sequence Generator, Router, Sorter, Stored Procedure, and Java transformations.
  • Used the Debugger to test the mappings and unit-tested them.
  • Involved in fixing invalid mappings, performance tuning, and testing of stored procedures and functions, Informatica sessions, batches, and the target data.
  • Developed Slowly Changing Dimension (SCD) Type 2 logic for loading data into dimensions and facts.
  • Coded highly complex SQL/PL/SQL queries, reviewed query plans, troubleshot slow queries, created statistics on tables, and optimized run times for very large record sets; a tuning sketch follows this list.
  • Used TOAD to increase user productivity and application code quality, providing an interactive environment that supports the user experience.
  • Developed and tested all the Informatica mappings, sessions and workflows - involving several Tasks.
  • Performed unit testing, integration testing, system testing, and data validation for the developed Informatica mappings.
  • Used the Data Validation Option (DVO) to compare masked values against the original data, guaranteeing that all sensitive values differ, confirming that all substituted values come from a data dictionary, and validating that masked data is in the proper format and not null.
  • Carried out defect analysis and fixed bugs raised by users.
  • Extensively used ETL to load data from flat files, both fixed-width and delimited, as well as from the relational database, Oracle 11g.
  • Imported metadata from different sources, such as relational databases, XML sources, and Impromptu catalogs, into Framework Manager.
  • Conducted and participated in process-improvement discussions, recommended possible outcomes, and focused on production application stability and enhancements.
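
To illustrate the query-plan and statistics work mentioned above, here is a generic Oracle tuning sketch; the DW schema and MY_BIG_TABLE are placeholders, not actual project objects:

    -- Gather optimizer statistics so the optimizer can cost the query realistically
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'DW', tabname => 'MY_BIG_TABLE');
    END;
    /

    -- Capture and display the execution plan of a slow query
    EXPLAIN PLAN FOR
      SELECT policy_id, SUM(premium_amt)
        FROM dw.my_big_table
       GROUP BY policy_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);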

Environment: Informatica Power Center 9.5.1, MDM, DB2, Mainframe, B2B, DT/DX, Oracle 11g, SQL Server 2000, SSRS, PL/SQL, TOAD, Data Validation Option (DVO), Autosys, UNIX.

Confidential, San Francisco, CA

Data Warehouse Developer

Responsibilities:

  • Involved in the analysis, design, development, testing, and implementation of Informatica transformations and workflows for extracting data from multiple systems.
  • Worked cooperatively with the team members to identify and resolve various issues relating to Informatica.
  • Designed mapping templates to specify the high-level approach.
  • Involved in massive data cleansing and data profiling of the production data load.
  • Designed and implemented ETL environments using ETL strategies and tools such as Informatica Power Center 8.x/9.x, Power Exchange, Metadata Manager, IDQ, and B2B.
  • Developed several PowerShell scripts to gather data on job history, policy evaluation, backup durations, and so on, using Central Management Server and SSRS subscriptions.
  • Built the Physical Layer, Business Model and Mapping Layer, and Presentation Layer of a repository using star schemas.
  • Performed data validation, reconciliation and error handling in the load process.
  • Tested data to ensure it was masked; unit testing, integration testing, system testing, and data validation were standard practice for the developed Informatica mappings.
  • Extensively worked with Informatica Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, the repository server, and the Informatica server to load data from flat files and SQL Server.
  • Created scripts for transforming unstructured data to a structured format using Informatica B2B Data Transformation.
  • Wrote PL/SQL programs such as procedures, functions, packages, and cursors to extract data from the target system; a minimal cursor sketch follows this list.
  • Processed vendor address elements, cleansing and converting them using Informatica.
  • Created Informatica mappings to handle complex flat files and load data into warehouse.
  • Designed the mappings between sources (files and databases) to operational staging targets.
  • Developed custom report-ordering selections using SQL Server Reporting Services (SSRS).
  • Configured and implemented the first instance of Informatica B2B - Data Transformation and Data Exchange.
  • Used Java, Aggregator, Sequence Generator, Lookup, Expression, Filter, Joiner, Rank, Router, and Update Strategy transformations in the data population process.
  • Involved in Informatica Repository migration.
  • Used stored procedures, functions, materialized views, and triggers at the database level and imported them into Informatica for ETL.
  • Designed and developed Informatica workflows/sessions to extract, transform, and load data into the Oracle server.
  • Translated business specifications into PL/SQL code; extensively used Explain Plan; designed, developed, and fine-tuned Oracle stored procedures and triggers.
  • Logged defects and submitted change requests using the Defects module of Test Director.
  • Worked through various Informatica tuning issues and fine-tuned transformations to make them more efficient in terms of performance.
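
A minimal sketch of the kind of PL/SQL extract procedure described above, using an explicit cursor. SRC_ORDERS and STG_ORDERS are hypothetical tables, not actual project objects:

    CREATE OR REPLACE PROCEDURE extract_orders (p_run_date IN DATE) AS
      -- Explicit cursor over the source rows for one run date
      CURSOR c_orders IS
        SELECT order_id, customer_id, order_amt
          FROM src_orders
         WHERE order_date = p_run_date;
    BEGIN
      FOR r IN c_orders LOOP
        INSERT INTO stg_orders (order_id, customer_id, order_amt, load_date)
        VALUES (r.order_id, r.customer_id, r.order_amt, SYSDATE);
      END LOOP;
      COMMIT;
    END extract_orders;
    /

For large volumes a single INSERT ... SELECT or a BULK COLLECT approach is usually faster than a row-by-row loop; the cursor form is shown because it mirrors the cursor-based programs listed above.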

Environment: Informatica Power Center 9.1/8.6, PL/SQL, B2B, DT/DX, DB2, Oracle 11g, SQL Server 2000, Windows 2000, Shell Scripting, Autosys.

Confidential, Santa Clara, CA

ETL Informatica Developer

Responsibilities:

  • Worked with business analysts to identify appropriate sources for Data warehouse and to document business needs for decision support data.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files, XML Files to target Oracle Data Warehouse database.
  • Extensively worked on Informatica 8.1 (mappings, sessions, and workflows), creating various mappings focused on SCD1 and SCD2 implementations.
  • Performed data manipulations using various Informatica transformations such as Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, and Sequence Generator.
  • Involved in creating logical and physical data models with star and snowflake schema techniques using Erwin, for the data warehouse as well as the data mart.
  • Wrote pre-session and post-session scripts for mappings.
  • Created sessions and workflow for designed mappings.
  • Extensively worked in the performance tuning of the programs, ETL Procedures and processes.
  • Worked with Operational Data Store (ODS).
  • Developed PL/SQL procedures for processing business logic in the database.
  • Migrated the mappings to the testing and production environments and introduced Informatica concepts to the testing department.
  • Used Perforce as a versioning tool to maintain revision history for code.
  • Performed code reviews to ensure that ETL development followed the company's ETL standards and best practices.
  • Worked on an ETL strategy to store data validation rules and error-handling methods for both expected and unexpected errors, and documented it carefully; a database-side error-logging sketch follows this list.
  • Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals.
  • Involved in performance tuning by identifying bottlenecks at various points, such as targets, sources, mappings, sessions, or the system, which led to better session performance.
  • Involved in migrating objects from DEV to QA, testing them, and then promoting them to production.
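
One common database-side pattern for the expected/unexpected error handling described above is Oracle DML error logging, sketched below. SALES_FACT and SALES_STG are hypothetical tables; the actual error handling on this project was ETL-level, so this is illustrative only:

    -- One-time setup: create ERR$_SALES_FACT to capture rejected rows
    BEGIN
      DBMS_ERRLOG.CREATE_ERROR_LOG(dml_table_name => 'SALES_FACT');
    END;
    /

    -- Load rows, diverting bad ones to the error table instead of failing the batch
    INSERT INTO sales_fact (sale_id, product_id, sale_amt)
    SELECT sale_id, product_id, sale_amt
      FROM sales_stg
      LOG ERRORS INTO err$_sales_fact ('DAILY_LOAD')
      REJECT LIMIT UNLIMITED;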

Environment: Informatica 8.1, DB2, Oracle 9i, XML, Windows NT, UNIX Shell Programming.

Confidential, Edison, NJ

ETL Developer

Responsibilities:

  • Communicated effectively with data architects, designers, application developers, and senior management to collaborate on projects involving multiple teams in a highly time-sensitive environment.
  • Effectively involved in the allocation and review of various development activities and tasks with offshore counterparts.
  • Understood the business logic behind every piece of code and documented requirements in a reverse-engineering fashion.
  • Responsible for documenting and resolving any production issues.
  • Provided end-user training and support.
  • Designed the ETL process to translate business requirements into mappings using Informatica Power Center: Source Analyzer, Warehouse Designer, Mapping Designer, and Workflow Manager/Monitor.
  • Involved in design, development and implementation of the Enterprise Data Warehouse (EDW) and Data Mart.
  • Used external tools such as Name Parser and Address Cleansing to cleanse the data in source systems; an illustrative SQL sketch follows this list.
  • Designed mappings using Source Qualifier, Joiner, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy transformations and Mapplets to load data into targets involving slowly changing dimensions.
  • Used Workflow Manager for creating and maintaining the Sessions and Workflow Monitor to monitor workflows.
  • Enhanced existing UNIX shell scripts as part of the ETL process to schedule tasks/sessions.
  • Carried out unit and integration testing for Informatica mappings, sessions and workflows.
  • Coordinated with end users and reporting teams to correlate business requirements.
  • Analyzed specifications provided by the client.
  • Set up ETL metadata.
  • Used Autosys as a scheduling tool for triggering jobs.
  • Used Perforce as a versioning tool to maintain revision history for code.
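
As a simple illustration of the address cleansing mentioned above, here is a SQL-level standardization sketch. The real cleansing used the external Name Parser and Address Cleansing tools; CUSTOMER_STG and its columns are hypothetical:

    -- Trim, upper-case, normalize a common abbreviation, and strip ZIP to 5 digits
    SELECT REPLACE(UPPER(TRIM(addr_line1)), ' STREET', ' ST') AS addr_std,
           UPPER(TRIM(city))                                  AS city_std,
           SUBSTR(REGEXP_REPLACE(zip, '[^0-9]', ''), 1, 5)    AS zip5
      FROM customer_stg;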

Environment: Informatica Power Center 8.1, Oracle 10g/9i, MS SQL Server 2005, Business Objects, Autosys, Toad 7.6, SQL, PL/SQL, Unix Shell Scripting, Windows.

Confidential

Jr. ETL Developer

Responsibilities:

  • Involved in the design and development of the data warehouse environment; served as liaison to business users and technical teams, gathering requirement specification documents and identifying data sources, targets, and report-generation needs.
  • Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse.
  • Worked on Informatica Power Center 7.1 tool - Source Analyzer, warehouse designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
  • Optimized query performance and session performance and reliability; performance-tuned Informatica components for daily and monthly incremental table loads.
  • Used Normalizer, Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Created complex workflows with multiple sessions and worklets with consecutive sessions.
  • Used Workflow Manager for creating, validating, testing, and running sequential batches.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce run time.
  • Involved in the Migration process from Development, Test and Production Environments.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Wrote stored procedures and triggers in Oracle 9i to manage consistency and referential integrity across the data mart; a minimal trigger sketch follows this list.
  • Tuned the mappings for optimum performance, dependencies, and batch design.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Scheduled the batches to be run using the Workflow Manager.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
  • Involved in migrating Informatica ETL application and Database objects through various environments such as Development, Testing, UAT and Production environments.
  • Documented and presented production support documents for the developed components when handing the application over to the production support team.
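
A minimal sketch of the referential-integrity triggers mentioned above. SALES_FACT and PRODUCT_DIM are hypothetical tables; where the design allows, a declarative foreign key is preferable to a trigger:

    CREATE OR REPLACE TRIGGER trg_sales_check_product
    BEFORE INSERT OR UPDATE ON sales_fact
    FOR EACH ROW
    DECLARE
      v_cnt NUMBER;
    BEGIN
      -- Reject rows that reference a product key missing from the dimension
      SELECT COUNT(*) INTO v_cnt
        FROM product_dim
       WHERE product_key = :NEW.product_key;
      IF v_cnt = 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Unknown product_key ' || :NEW.product_key || ' in SALES_FACT load');
      END IF;
    END;
    /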

Environment: Informatica Power Center 7.1, Oracle 9i, Erwin 4.x, MS SQL Server 2005, Robot, Toad 7.6, SQL, Unix Shell Scripting.
