
ETL Developer Resume


SUMMARY

  • Over five years of professional IT experience in ETL, Business Intelligence and Data Warehousing. Extensively worked on MS SQL Server 2005/2008 and MSBI technologies (SSIS, SSAS, SSRS), as well as Informatica and Unix.
  • Experience in the software development life cycle, business requirement analysis, design, programming, database design, data warehousing and business intelligence concepts, and Star Schema and Snowflake Schema methodologies.
  • Good understanding of Ralph Kimball's and Bill Inmon's strategies for designing large data warehouses. Experience in working with slowly changing dimensions (Fixed, Changing and Historical).
  • Expert in data migration from various data sources like Excel, SQL Server and flat files using SSIS packages and SQL commands, as well as monitoring, debugging and tuning ETL jobs.
  • Created SSIS packages to transfer data between OLTP and OLAP Databases with different types of control flow tasks and data flow transformations, securing and deploying the packages.
  • Experience in Validating and testing the SSIS packages on the development server.
  • Expert in creating various SQL Server objects like databases, schemas, tables, indexes and indexed views. Expertise in creating complex stored procedures (dynamic and static), effective functions, and appropriate triggers to facilitate efficient data manipulation and data consistency.
  • Experience in designing, creating and processing cubes using SSAS. Created and configured Data Sources and Data Source Views, Dimensions, Cubes, Measures, Partitions and KPIs using SQL Server Analysis Services (SSAS 2005/2008).
  • Expert in calculating measures and dimension members using Multidimensional Expressions (MDX) and mathematical formulas.
  • Experience in report generation using the authoring and managing components of SSRS, from both relational databases and OLAP cubes, including MDX development.
  • Experience in generating on- demand and scheduled reports for business analysis and management decisions using SQL Server Reporting Services (SSRS).
  • Experience in developing custom reports and different types of tabular, matrix, ad hoc and distributed reports in multiple formats using SQL Server Reporting Services (SSRS) in Business Intelligence Development Studio (BIDS).
  • Experienced in creating test data and unit test cases. Writing test cases and system plans to ensure successful data loading process.
  • In-depth knowledge of Relational Data Modeling, Dimensional data modeling and design. Extensive experience in data analysis using SQL and MS-Excel.
  • Experience in performance tuning, query optimization and database consistency checks. Extensively used tools like SQL Profiler, Database Engine Tuning Advisor and Windows Performance Monitor. Experience in creating jobs and SQL Mail Agent.
  • Experience in installation, upgrade and configuration of Microsoft SQL Server and databases.
  • Good work experience on System analysis, Design, Development, testing and implementation of projects. Ability to profile and understand the different source systems. Very good skills of documenting different kinds of metadata.
  • Excellent communication and presentation skills, with strong analytical and problem-solving abilities. Liaise with developers and user representatives in application design and document reviews.
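The slowly changing dimension handling mentioned above can be sketched in T-SQL. The DimCustomer and StagingCustomer tables and the City attribute below are hypothetical, illustrating a Type 2 (historical) change, not an actual project schema:

```sql
-- Hypothetical tables; City is the Type 2 (historical) attribute.
-- Step 1: expire the current row for customers whose City changed.
UPDATE d
SET    d.EndDate   = GETDATE(),
       d.IsCurrent = 0
FROM   dbo.DimCustomer AS d
JOIN   dbo.StagingCustomer AS s
       ON s.CustomerID = d.CustomerID
WHERE  d.IsCurrent = 1
  AND  s.City <> d.City;

-- Step 2: insert a fresh current row for every customer that now
-- lacks one (changed customers expired above, plus new customers).
INSERT INTO dbo.DimCustomer (CustomerID, City, StartDate, EndDate, IsCurrent)
SELECT s.CustomerID, s.City, GETDATE(), NULL, 1
FROM   dbo.StagingCustomer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.DimCustomer AS c
                   WHERE  c.CustomerID = s.CustomerID
                     AND  c.IsCurrent  = 1);
```

Fixed and Changing attributes would instead be ignored or updated in place on the current row.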

TECHNICAL SKILLS:

Business Intelligence

MS SSRS 2005/2008, MS SSAS 2005/2008, Actuate 5.

ETL Skills

MS SQL Server SSIS 2005/2008, Informatica Power Center v 7.x, SQL, PL/SQL, ETL coding through PL/SQL, Export, Import, Unix Shell Scripting, Cron Job Development.

Databases

MS SQL Server 2005/2008, Oracle 10g and 9i, Microsoft Access, DB2 and Sybase.

Tools

Callidus 5, Toad, SQL Navigator, Erwin, MS-Excel and MS-Access.

Operating Systems

Windows XP/2003 Server/2000 Pro, Unix (Sun Solaris/HP-UX/AIX) and Linux.

Data Warehousing Methodologies/Processes

Dimensional Modeling (Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables), Kimball Methodology, Inmon Methodology, maintenance of Operational Data Stores (ODS) and Enterprise Data Warehouses (EDW), OLAP, Metadata Management, Data Migration and Data Cleansing Techniques, Data Profiling, Data Quality and Data Validation Scripts.

Internet Software/Other Tools

HTML, XML, UML, Clear Quest, Case Studio and Harvest.

PROFESSIONAL EXPERIENCE

Confidential, New Jersey, Oct 2010 – Present.
Campaign Effectiveness Reporting (CER) - A web-based reporting tool that uses Microsoft's SQL Server Analysis Services, Integration Services and Reporting Services to report on and analyze the effectiveness of different marketing campaigns, segment-wise (for targeted people) and region-wise, for direct and indirect customers. It is used by various business groups for campaign reporting and analysis.

Responsibilities:

  • As an ETL / BI Developer for the CER data cube project, involved in all phases of CER development.
  • Understanding existing business model and customer requirements. Detailed study and data profiling of the underlying application systems for the sales of the different campaigns promoted.
  • Elicit requirements using interviews, document analysis, business process descriptions, use cases, scenarios, business analysis, task and workflow analysis.
  • Developed the proto-type of CER and delivered six months of historical data in development environment to the business users for testing purpose.
  • Filtered bad data from legacy system using T-SQL and implemented constraints and triggers into new system for data consistency.
  • Identify the field/column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the SSIS packages. Identify and document the source to target packages.
  • Prepared detailed ETL package specification documents describing the algorithms and flowcharts and covering, for each package in the datamart, the source systems, change data capture logic, transformation logic for each field, the lookup tables and lookup logic used, and the target systems. Also documented the important SSIS session properties to be used when executing SSIS packages loading from Teradata into the SSIS database.
  • Designed and documented the error-handling strategy in the ETL load process using event handlers. Consistently applied these error-handling techniques in all the packages.
  • Designed and built a cube (CER Cube) using SQL Server Analysis Services (SSAS 2008) with 10 dimensions, built daily partitions spanning 13 months, developed aggregations and calculated members for the cube, and generated daily reports to measure the sales of the segmented people across different regions for different campaigns.
  • Designed and developed the ETL data flow to populate the CER Cube analysis database using SSIS (SQL Server Integration Services) packages. Scheduled and automated the packages to keep the cube's data up to date daily for business reporting.
  • Created SSIS packages using advanced transformations (Pivot, Derived Column, Conditional Split, Term Extraction, Aggregate, Multicast). Created SQL Server configurations for SSIS packages and XML & Windows event logs.
  • Designed and deployed direct and indirect customers reports for targeted segmentation of people and across different regions with Drill Down approach. Also developed Drill-through reports to measure the performance detail of different distribution channels.
  • Created quarterly sales reports with Chart controls to measure the progress of sales every quarter for different distribution channels.
  • Created test data and unit test cases to ensure successful data loading process.
  • Used Notification Services to generate error messages and send them to users through e-mail. Created report snapshots to improve SSRS performance.
  • Responsible for helping manage the daily operations of the company, meeting with business analysts, end-users for resolving the issues.
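A calculated member of the kind described for the CER cube might look like the following MDX sketch. The measure names (Targeted Count, Responder Count) are illustrative assumptions, not the actual cube definition:

```mdx
// Hypothetical campaign response-rate measure; guards against
// divide-by-zero by returning NULL when nobody was targeted.
CREATE MEMBER CURRENTCUBE.[Measures].[Response Rate] AS
    IIF([Measures].[Targeted Count] = 0,
        NULL,
        [Measures].[Responder Count] / [Measures].[Targeted Count]),
    FORMAT_STRING = 'Percent',
    VISIBLE = 1;
```

Such a member can then be sliced by the region and segment dimensions in the daily SSRS reports.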

Environment: SQL Server 2008, SSAS 2008, SSRS 2008, SSIS 2008, Teradata V2R6.2.

Confidential, Atlanta, GA, Sep 2009 – Sep 2010.

Callidus Sales Compensation System for Small Business Services (SBS) - An Enterprise Incentive Management (EIM) application for BellSouth's Small Business Services that develops and manages incentive compensation linked to the achievement of strategic business objectives. It supports sales compensation for SBS outbound employees and vendors, distribution employees, and SBS inbound managers, and represents the largest deployment of a packaged incentive software product at BellSouth, with almost 6,000 payees eligible for monthly and quarterly payments through Callidus. Used by compensation coordinators, HR pay administrators, employees and partners.

Responsibilities:

  • Data warehouse developer on the Callidus Sales Compensation Small Business Services and BI projects team. Involved in all phases of the Sales and Billing datamart project, from scratch and end to end, performing different roles in different phases.
  • Detailed study and data profiling of all the underlying transaction application systems for the Sales and Billing subject areas and understand the data models designed by the architect. Identify and capture the right metadata from source systems.
  • Developed the proto-type and delivered three months of historical data in development environment to the business users for a proof of concept.
  • Identify the field/column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the ETL packages.
  • Prepared detailed ETL specification documents describing the algorithms and flowcharts and covering, for each package in the datamart, the source systems, change data capture logic, transformation logic for each field, the lookup tables and lookup logic used, and the target systems. Also documented the important SSIS session properties to be used when executing SSIS packages.
  • Designed and developed mappings for loading the source data into the staging area and for loading dimension, fact and aggregate tables.
  • Performed data conversions from flat files into a normalized database structure. Migrated Organization, Orders, Adjustments and Objectives data in various formats, such as text-based files and Excel spreadsheets, to SQL Server databases using SQL Server Integration Services (SSIS) to overcome the transformation constraints.
  • Wrote Queries for generating drill down and drill through reports in SSRS 2005.
  • Generated reports using SQL Server Reporting Services 2005/2008 from OLTP and OLAP data sources.
  • Designed and deployed reports with drill-down, drill-through, drop-down menu options, and parameterized and linked reports. Developed stored procedures and views to generate drill-through, parameterized and linked reports.
  • Created reports with different types of properties like Chart controls, Filters, Interactive Sorting. Applied conditional formatting in the reports using SSRS to highlight key areas in the report data.
  • Created report snapshots to improve performance of SSRS.
  • Deployed and scheduled reports using SSRS to generate daily, weekly, monthly and quarterly reports, including current status, for executives, business analysts and end users across various categories and regions based on business needs.
  • Prepared test scripts, including complex SQL scripts to compare source and target data; executed the test scripts and validated the results. Also coordinated with business users to perform User Acceptance Testing (UAT), and prepared migration and deployment plans and coordinated deployments.
  • Scheduling and monitoring ETL packages daily load. Developed Exception handling process for each SSIS package. Tested the data with complex queries, joins and sub queries.
  • Provided Technical and Development Solutions for requirements raised by Clients. Used Backup/Restore and Normalization.
  • Involved in the enhancement/maintenance of the datamart. Performed impact analysis and designed, developed, tested and deployed functionality for a new source system in the existing datamart.
  • Monitored historical data loads and ongoing loads. Performed root cause analysis of data warehouse production issues.
  • On-call production support for the regular data warehouse load process and offshore coordination.
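The source-to-target comparison scripts described above can be sketched as a simple reconciliation query. The staging and warehouse table names (stg.Orders, dw.FactOrders) are placeholders, not the project's actual schema:

```sql
-- Compare row counts between a source extract and its target fact table;
-- a FAIL result flags a load for investigation before UAT sign-off.
SELECT 'Row count' AS CheckName,
       src.Cnt     AS SourceCount,
       tgt.Cnt     AS TargetCount,
       CASE WHEN src.Cnt = tgt.Cnt THEN 'PASS' ELSE 'FAIL' END AS Result
FROM  (SELECT COUNT(*) AS Cnt FROM stg.Orders)    AS src
CROSS JOIN
      (SELECT COUNT(*) AS Cnt FROM dw.FactOrders) AS tgt;
```

In practice the same pattern extends to sums of key measures and distinct counts of business keys, not just raw row counts.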

Environment: SQL Server 2000/2005, SSIS 2005, Windows XP, MS Visio, Erwin.

Confidential May 2006 to April 2009

European Data Distribution Center (EDDC), a datamart for GE CCF providing a centralized platform for European accounts transactions related to debtor and creditor status.
Responsibilities:

  • Worked closely with executive sponsors, users and decision makers to develop the transformation logic to be used in Informatica.
  • Extensively worked on Unix shell scripts and PL/SQL stored procedures.
  • SDLC - from analysis to production implementation, with emphasis on identifying the source and source data validation, implementing business logic and used transformations as per the requirement in developing mappings and loading the data into the target.
  • Extensively used transformations like Router, Aggregator, Source Qualifier, Joiner and Expression. Developed mappings for loading fact tables. Designed and developed reusable objects.
  • Scheduled the sessions to extract, transform and load data into the warehouse as per business requirements.

Environment: Informatica Power Mart 5.x, Business Objects, Oracle 7, Windows NT, Unix.

Total Payables System (TPS) for GE CCF, a data integration project between AS/400 and Oracle Financials. 
Responsibilities:

  • Understanding existing business model and customer requirements. Responsible for mandatory checks, date field checks and number field checks in Informatica. Incorporated error-logging logic in all the mappings. Worked closely with users on source system problems.
  • Used various transformations like Lookup, Update Strategy, Router, Filter, Sequence Generator, Source Qualifier and Joiner on data extracted according to the business rules and technical specifications.
  • Designed the automation process and implemented data movement in Informatica without manual intervention. Created and executed Informatica sessions using server manager.
  • Involved in system integration and the entire system test and support of the project.
  • Developed shell scripts to repeat the core session run for all given input files, as well as shell scripts for FTP transfers and status mail.
  • Improved the performance of the mappings using various optimization techniques.
  • Involved in the documentation of the project.
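The FTP-and-status-mail pattern above can be sketched as a small shell script. The file name and recipient address are placeholders, and the actual transfer and mail commands appear only as comments:

```shell
#!/bin/sh
# Placeholder input file and recipient; not actual project values.
LOAD_FILE="daily_input.dat"
RECIPIENT="etl-support@example.com"

# Decide the load status based on whether the input file is present.
if [ -f "$LOAD_FILE" ]; then
    STATUS="SUCCESS: $LOAD_FILE staged"
else
    STATUS="FAILED: $LOAD_FILE not found"
fi

# The real job would transfer the file and mail the result, e.g.:
#   ftp -n "$FTP_HOST" < put_commands.txt
#   echo "$STATUS" | mailx -s "Core session load status" "$RECIPIENT"
echo "$STATUS"
```

Wrapping each core session run this way lets one cron entry report success or failure for every input file it processes.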

Environment: Informatica Power Mart 5.x, Oracle 7, Windows NT, Unix.

EDUCATION
  • Master of Business Administration (M.B.A.), Information Systems.
