ETL/BW Analyst Resume
SUMMARY:
- 7+ years of IT experience in Data Warehouse, ETL, Business Intelligence, Testing, documentation and implementation of client/server applications.
- Strong technical background and experience in ETL using Informatica.
- Results-oriented experience in implementing Data Warehouse and Business Intelligence solutions using Informatica PowerCenter/PowerMart (Designer, Workflow Manager, Workflow Monitor, and Repository Manager).
- Proficient in the Data Warehousing project life cycle, including dimensional modeling, data extraction, transformation and loading, and schema design (Star Schema and Snowflake Schema).
- Excellent skills in PL/SQL and SQL, with working knowledge of Databases like Oracle, DB2 and MS SQL Server.
- Hands-on experience with Agile methodology.
- Experienced in interacting with business users to analyze business processes and requirements, translating requirements into screens and design objects, and documenting and rolling out deliverables.
- Proven ability to determine priorities, schedule work, and meet critical deadlines within budgetary guidelines.
- Team player with excellent communication and presentation skills.
- Ability to work independently as well as in team-oriented, fast-paced, and challenging environments.
- Rated as ISTJ by Myers-Briggs Type Indicator.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 8.6/8.1/7.1/6.x/5.x, PowerExchange 8.6, DataStage.
OLAP Tools: Business Objects 6.5, COGNOS ReportNet 1.1MR2.
Testing: Manual and automated testing using QTP, Mercury WinRunner & LoadRunner.
Data Modeling Tools: Erwin, Framework Manager.
Languages: C, C++, SQL, Perl, PL/SQL, VBScript/JavaScript & HTML.
Web Technologies: XML, HTML & JavaScript.
RDBMS: Oracle 9i, MySQL, Teradata, DB2, MS SQL Server 2000/7.0 & MS-Access.
Operating Systems: Windows NT/2000/XP, UNIX and DOS.
MS-Office: Word, Excel, Visio & PowerPoint.
Scheduling Tools: Tidal, Control-M.
EXPERIENCE:
Confidential, IL April ’11 to Present
Project Name: Dynamic Pricing Re-Architecture
Role: ETL/BW Analyst
Confidential envisioned the Dynamic Pricing Re-Architecture to process sales, inventory and pricing data from source systems such as Mainframe, Teradata, DB2, MySQL and flat files to build a data warehouse, and to develop an integrated, single-source platform for all web intelligence applications that provides analytics for pricing users.
Responsibilities:
- Transformed business rules into technical requirements and created mapping documents.
- Worked on loading of data from several flat file sources.
- Created UNIX shell scripts to load files.
- Fine-tuned and modified existing Informatica processes and SQL scripts.
- Worked on PL/SQL programming – Stored Procedures, Triggers, Functions and Packages.
- Coordinated offshore team in developing Informatica mappings.
- Assisted the team in the development of design standards and code for effective, scalable and maintainable ETL processes.
- Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems.
- Data analysis and ETL for Dynamic Pricing Re-architecture project on Hadoop platform.
- Built Sqoop scripts to load data from DB2 and MySQL into the Hadoop file system (HDFS).
- Developed Teradata FastExport scripts to extract data from Teradata into flat files, then copied the data to HDFS.
- Developed SQL scripts to load history and incremental data.
- Created Hive tables and the scripts to define and load them (an illustrative sketch follows this list).
- Used Hue for querying Hive tables.
- Used Control-M for job automation.
- Prepared various documents: Data Mapping, ETL Standards, Technical and Deployment documents.
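Illustrative sketch (hypothetical table, column and path names, not taken from the project): a HiveQL script of the kind used to expose Sqoop/FastExport extracts landed in HDFS as a Hive table and to load an incremental partition.

    -- Hypothetical staging table over delimited files landed in HDFS by Sqoop/FastExport
    CREATE EXTERNAL TABLE IF NOT EXISTS stg_item_price (
        store_id   INT,
        item_id    INT,
        unit_price DOUBLE,
        eff_date   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    STORED AS TEXTFILE
    LOCATION '/data/stage/item_price';

    -- Incremental load: rewrite only the partition for the current load date
    -- (dw_item_price is assumed to be partitioned by load_dt)
    INSERT OVERWRITE TABLE dw_item_price PARTITION (load_dt = '2012-01-15')
    SELECT store_id, item_id, unit_price, eff_date
    FROM stg_item_price;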
Environment: Informatica PowerCenter 8.6.1, Teradata, Control M, Hadoop, Hue, Hive, Sqoop, IBM DB2, Oracle, MySQL, UNIX (Sun OS), Win XP, MS ACCESS, TOAD, SQL Developer, SQLyog Enterprise.
Confidential, Great Falls, MT April ’09 to April ’11
Project Name: Health Informatics
Role: ETL-Application Software Engineer
Confidential is a multi-line healthcare enterprise that provides programs and related services to individuals receiving benefits under Medicaid, including Supplemental Security Income (SSI) and the State Children's Health Insurance Program (SCHIP). It also contracts with other healthcare organizations to provide specialty services including behavioral health, disease management, managed vision, nurse triage, pharmacy benefit management and treatment compliance.
The main aim of the project is to build an Enterprise Data Warehouse (EDW) consisting of four schemas that segregate the data into separate ETL processes. The first, ETL_STAGE_OWN, consists of tables that resemble the source table structures of systems such as AMISYS, CACTUS and CCMS. The second, ETL_TRANSFORM_OWN, consists of tables used to transform source data into the intended EDW structure. The third, ETL_LOAD_OWN, consists of the star schema tables for the EDW. The final stage, ETL_ACCESS_OWN, is the data access layer of the EDW; in this layer the end-user information delivery application (Business Objects) exposes detail data to end users and allows for snapshot versions of the data as required.
Responsibilities:
- Developed real time mappings with the sources registered in CDC to the staging area.
- Used Data Stencil to create various mapping layouts.
- Imported Oracle, SQL Server and Flat file sources in to Informatica to build mappings as per business requirements.
- Created load layer mappings using various transformations such as Connected and Unconnected Lookup, Filter, Aggregator, Union, Joiner, Expression, Router, Update Strategy and Normalizer.
- Used SQL overrides to handle inconsistent data and to avoid unnecessary Joiner transformations (an illustrative override follows this list).
- Used Variable Port in Expression Transformation to get the history values and built conditions based on the business rules.
- Created Mapplets and Re-Usable Transformations to enforce business logic reusability.
- Designed mappings with mapping parameters, variables and expression variables.
- Created mappings to populate Type I, Type II and Type III Slowly Changing Dimensions.
- Developed mappings involving multiple pipelines and ordered the load using Target Load Plan.
- Used Constraint Based Load Ordering to populate multiple targets in the same mapping.
- Created a reusable e-mail task and included it in sessions for notification in the event of session failure.
- Used Versioning in PowerCenter to track different versions of mappings, sessions and workflows.
- Applied Labels to ease the migration process from DEV to TEST.
- Created various queries in the Query Window to search various objects in the repository.
- Used parameter files to provide run-time values to mappings and sessions.
- Implemented Decision Task in workflows to control the flow of execution.
- Analyzed Session Logs and Workflow Logs to monitor the execution of mapping.
- Populated dimensions and facts in Snow Flake and Star Schema.
- Involved in designing mappings to reduce performance overhead.
- Created Unit Test Plan Documents and Technical Specification Documents for various mappings.
- Unit tested various mappings using Debugger in PowerCenter.
- Performed Cold / Warm Start for recovery.
- Scheduled and Monitored jobs in Tidal.
- Participated in Peer Reviews and suggested changes to enhance performance.
- Monitored and restarted PowerExchange CDC Condenses and Listeners.
- Supported production jobs and resolved failures to meet SLA.
- Anticipated performance issues and tuned production jobs to remove potential bottlenecks.
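Illustrative sketch (hypothetical staging tables and columns): the kind of Lookup SQL override used to pre-join and standardize inconsistent source values rather than adding a separate Joiner transformation.

    -- Hypothetical staging tables; names are for illustration only
    SELECT m.MEMBER_ID,
           UPPER(TRIM(m.MEMBER_SSN)) AS MEMBER_SSN,   -- standardize inconsistently formatted values
           p.PLAN_CODE
    FROM   STG_MEMBER m
           INNER JOIN STG_PLAN p
                   ON p.PLAN_ID = m.PLAN_ID           -- join pushed into the override
    WHERE  m.RECORD_STATUS = 'A'                      -- active records only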
Environment: Informatica PowerCenter 8.6, PowerExchange, Business Objects, Oracle SQL Developer, Windows NT/XP, UNIX, Teradata, SQL Server, Erwin, TOAD, Data Stencil.
Client Name: Confidential, Southborough, MA April ’08 to March ’09
Project Name: Business Intelligence Solution
Role: ETL Developer
Confidential is a textile company with a number of stores and outlets. Its major functions are production, sales, quality, materials and financial operations. The database holds a large volume of historical data, so the data warehouse plays a major role in enabling individual stores to view data at the lowest level of detail and to make decisions quickly and accurately, bringing more revenue to the company through new policies. It ultimately improves the organization's ability to respond to the increasingly complex and dynamic requirements of its customers and marketplace.
Responsibilities:
- Involved in designing of mappings according to the requirement specifications and created sessions.
- Modified, optimized the existing mappings as per the documents prepared by Business Analysts.
- Extracted data from SQL Server and flat file sources and applied Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems as per business rules.
- Involved in analyzing the source data from different data sources such as Oracle and Flat files.
- Involved in ETL performance assessment and tuning.
- Implemented reusable business logic using mapplets.
- Designed worklets and involved in scheduling workflows.
- Compared code in Dev/Test/Prod for Code Consistency.
- Involved in designing the Oracle database tables according to the business users' requirements.
- Partnered with Business Users and DW Designers to understand the processes of Development Methodology, and then implement the ideas in development accordingly.
- Created Oracle PL/SQL queries and Stored Procedures, Packages, Triggers, Cursors.
- Involved in identifying and tracking Slowly Changing Dimensions (SCD); a database-side sketch of the equivalent Type 1 logic follows this list.
- Created reusable Mailing alerts, Events, Tasks, Sessions, Reusable Worklets and Workflows in Workflow manager.
- Designed mappings with mapping parameters, variables and expression parameters.
- Implemented data processing like Sequential Batch Processing, Parallel Batch Processing and Control Flow.
- Used Confio to monitor query performance.
- Interpreted the threads in session log to pin point the bottlenecks.
- Used UNIX scripts in post session commands to handle files.
- Used decision task to control the flow of execution in a workflow.
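Illustrative sketch (hypothetical dimension, staging and sequence names): the SCD handling itself was built in Informatica mappings; the Type 1 overwrite behavior is equivalent to an Oracle MERGE like the one below.

    -- Hypothetical objects: DIM_CUSTOMER, STG_CUSTOMER, DIM_CUSTOMER_SEQ
    MERGE INTO DIM_CUSTOMER d
    USING STG_CUSTOMER s
       ON (d.CUSTOMER_NK = s.CUSTOMER_NK)               -- match on the natural/business key
    WHEN MATCHED THEN
        UPDATE SET d.CUSTOMER_NAME = s.CUSTOMER_NAME,   -- Type 1: overwrite changed attributes
                   d.CITY          = s.CITY
    WHEN NOT MATCHED THEN
        INSERT (CUSTOMER_SK, CUSTOMER_NK, CUSTOMER_NAME, CITY)
        VALUES (DIM_CUSTOMER_SEQ.NEXTVAL, s.CUSTOMER_NK, s.CUSTOMER_NAME, s.CITY);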
Environment: Informatica 7.1, COGNOS ReportNet 1.1MR2, SQL Server 2000, Windows 2000/2003, Mercury TestDirector 7.6, Oracle, TOAD.
Client Name: Confidential, Brookfield, WI, USA Feb ’07 to March ’08
Project Name: GL Analyzer
Role: ETL Developer
Confidential's objective is to replace the existing FADB (Financial Analysis Database) solution. The FADB solution is currently live in production and is divided into three databases (AM, AS and EU) based on the poles (America, Asia and Europe). GL Analyzer aims to provide a global data warehouse and a common model for user interfaces, serving all global users and giving better performance when querying financial transactions made in different source systems. The source system data covers vendors, customers, employees and currencies. The GL Analyzer project will bring the overall GL functionality into the EDW (Enterprise Data Warehouse), as it exists today in the GE-HealthCare FADB database.
Responsibilities:
- Analyzed source data coming from various databases and files; worked with business users and developers to develop the data model.
- Identified data source systems integration issues and proposed feasible integration solutions.
- Used Erwin to identify fact and dimension tables and was involved in dimensional modeling to design and develop Star and Snowflake schemas (an illustrative DDL sketch follows this list).
- Partnered with Business Users and DW Designers to understand the processes of Development Methodology, and then implement the ideas in development accordingly.
- Created Oracle PL/SQL queries and Stored Procedures, Packages, Triggers, Cursors.
- Involved in designing various mappings with Business rules using the transformations.
- Created reusable Mailing alerts, Events, Tasks, Sessions, Worklets and Workflows in Workflow manager.
- Implemented data processing like Sequential Batch Processing, Parallel Batch Processing and Control Flow.
- Scheduled the workflows at specified frequency according to the business requirements and monitored the workflows using Workflow Monitor.
- Involved in performance tuning of Source and Target databases, Mappings and Sessions, improved performance by identifying performance bottlenecks.
- Fixed invalid mappings, debugged mappings in Designer, and performed unit and integration testing of Informatica sessions, worklets and workflows.
- Involved in testing of ETL objects based on functional requirements.
- Produced a Unit test document, which captures the test conditions and scripts, expected/actual results and data cross-section and volume for Individual interfaces.
- Created performance metrics document and Transformation specification for Informatica Mappings.
- Recommended remedies for performance bottlenecks.
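Illustrative sketch (hypothetical table and column names, Oracle DDL): a simplified slice of the kind of star schema modeled in Erwin, with a GL fact referencing its dimensions.

    -- Hypothetical dimension and fact tables for illustration only
    CREATE TABLE DIM_CURRENCY (
        CURRENCY_SK   NUMBER       PRIMARY KEY,    -- surrogate key
        CURRENCY_CODE CHAR(3)      NOT NULL,
        CURRENCY_NAME VARCHAR2(50)
    );

    CREATE TABLE DIM_DATE (
        DATE_SK       NUMBER       PRIMARY KEY,
        CALENDAR_DATE DATE         NOT NULL,
        FISCAL_PERIOD VARCHAR2(10)
    );

    CREATE TABLE FACT_GL_TRANSACTION (
        GL_TXN_ID     NUMBER       PRIMARY KEY,
        DATE_SK       NUMBER       NOT NULL REFERENCES DIM_DATE (DATE_SK),
        CURRENCY_SK   NUMBER       NOT NULL REFERENCES DIM_CURRENCY (CURRENCY_SK),
        TXN_AMOUNT    NUMBER(18,2)
    );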
Environment: Informatica 7.1/6.2, Teradata, Oracle 9i/8i, PL/SQL, SQL, DB2, UNIX (Sun Solaris 8.0), Windows 2000, Erwin 4.0, COBOL, XML, Mercury TestDirector 7.6, TOAD, Terminal Services Client.
Confidential, Hyderabad, India Nov ’05 to Jan ’07
Role: Programmer
Project # 1: SOP (Sales order processing)
Client: Confidential,
The application deals with how a customer places orders for products. An invoice is generated once an order is placed and is mailed to the customer. If any product reaches its reorder level, a reorder form is generated and sent to the supplier. A sales report can be generated for a specific period, and additional reports such as stock details, Goods Received notes and Goods Returned notes can also be generated from the application.
Environment: Informatica Power Center7.1, SQL Server, Oracle 8i, PL/SQL, Windows 2000.
Project # 2: Warehouse Management System
Client: Confidential,
This project was developed for inventory control. The system contains master details such as item, quantity on hand, expiry date and last issue date. Daily transactions such as receipts/purchase issues, purchase returns and issue returns are entered, and the item master is updated so that the stock status can be viewed at any time. Reports available in this system include an adjustment report, a slow/fast moving item report and a stock ledger.
Responsibilities:
- Participated in preparation of Software Requirement Specifications.
- Involved in creation of sessions, worklets and workflows and scheduled in Workflow Manager.
- Performed map testing in Designer according to the Mapping specs and prepared unit test documents with all the statistics.
- Identified bottlenecks and involved in performance tuning.
- Configured E-mail tasks to notify failures.
- Involved in data profiling.
Project # 3: Automation of Sales System
Client: Confidential,
The objective of the system is to automate the existing manual sales process and integrate it with other systems. It includes modules for quotations, booking, stock reports, delivery information and bill generation.
Responsibilities:
- Involved in analysis, design, development and testing of different modules.
- Developed Database Triggers, Stored Procedures, Functions and Packages.
- Created files and entry screens, and provided exception and error handling in the application.
- Documented the development process for future reference.
- Created and tested use cases.
Environment: UNIX, Oracle 8i, SQL, and PL/SQL.
Confidential, Hyderabad, India April ’03 to Dec ’04
Project Name: Messaging System
Role: Software Engineer
Developed an Instant Message Schedule System that helps users check the timetable for a particular course. Users can submit a request by sending the course code via SMS, through WAP, or over the internet via the web.
Responsibilities:
- Worked with PL/SQL statements to extract data from tables and verify the output data.
- Involved in QA testing.
- Worked as a team member on system testing.
- Developed Test cases and wrote detailed test procedures to check the functionality of the application in accordance with user requirement and specification document.
- Worked with customized test tools to check the interface between modules and reported bugs to development team.
Environment: Windows, Oracle 8i, SQL, and PL/SQL.
EDUCATION
- Master's in Internet Systems
- Bachelor of Engineering in Electronics and Communications