
SQL ETL Resume

Waltham, MA

SUMMARY

  • 9+ years of IT experience with a strong background in data warehousing project development, including five years of ETL and reporting experience using SQL Server SSIS/SSRS across industry verticals such as Finance, Education, Insurance, Health Care, and Advertising/Marketing.
  • 4+ years of experience in the financial investment area.
  • Over one year of experience building a data warehouse based on users' activity on the website.
  • Developed solutions for AUM (Assets Under Management).
  • Involved in defining Portfolio and Security level attributes.
  • Involved in building Trading/Finance reports based on different products (Fixed Income, Equity, Structured and Alternative (Private Equity) products).
  • Involved in developing security classification system (MAC "Multi Asset Class": Equities, Derivatives, Debt Instruments aka Fixed Income) to run various holding reports.
  • Analyzed business and technical requirements, as well as logical data model to understand the detailed data management requirements.
  • Conversant with all phases of Software Development Life Cycle (SDLC) involving Systems Analysis, Design, Development and Implementation.
  • Experience in Installing, Configuring, Managing, Monitoring and Troubleshooting SQL Server 2008 R2/2008/2005.
  • Experience in Extraction, Transformation and Loading (ETL) using SSIS, creating mappings/workflows to extract data from SQL Server, Oracle, Excel file, SharePoint list and flat file sources and load into various business entities.
  • Expertise in SSIS direct and indirect configuration (XML, SQL Server) setup, SSIS audit and logging, error handling, expressions and variables.
  • Expertise in performance tuning of Data Flow Task in SSIS Packages.
  • Excellent backend skills in creating SQL objects like Tables, complex Stored Procedures, Views, Indexes, effective Functions and appropriate Triggers to facilitate efficient data manipulation and data consistency.
  • Experience in creating Historical and Incremental Jobs, Alerts, setting up SQL server mail agent for SSIS packages.
  • Expert in report writing using SQL Server Reporting Services (SSRS) and in creating various types of reports like Drill Down, Drill Through, Parameterized, Matrix, and Chart Report, Web reporting by customizing URL Access.
  • Expertise in writing T-SQL (DML & DDL) Code, Dynamic SQL, Query optimization & performance tuning. Used SQL Profiler, Execution plan, Performance Monitor and DBCC commands.
  • Experience in migrating databases from SQL Server 2005 to SQL Server 2008.
  • Proficient knowledge of SQL Server administration: implementing backup and restore strategy, capacity planning, database maintenance, user authorizations, user-defined roles, etc.
  • Expertise in creating Clustered and Non-Clustered Indexes, Covering Indexes, and Indexed Views to improve performance on the database side.
  • Experience in Dimensional Modeling, ER Modeling using Erwin, Star Schema / Snowflake Schema, Fact and Dimensional tables and Operational Data Store (ODS).
  • Expertise in deploying SSIS Package, SSRS Reports and SSAS Cubes using command line utilities.
  • Experience working in Agile (Scrum) and Waterfall process models.
  • Excellent skill in Testing SSIS, SSRS, SSAS, TSQL objects. (White Box and Black Box)
  • Good Experience in writing Deployment Documents for SIT/QA/Prod environments, Mapping Document, Test Scenarios and Release documents.
  • Flexible, enthusiastic and project oriented team player with excellent written, verbal communication and leadership skills to understand and develop creative solutions for challenging client needs.
  • Maintained positive communications and working relationships at all business levels.
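The covering-index technique mentioned above can be illustrated with a minimal T-SQL sketch; the table and column names here are hypothetical:

```sql
-- Hypothetical Orders table, for illustration only.
-- A covering index: the key columns support the seek and sort, and the
-- INCLUDEd columns let the query below be answered entirely from the
-- index, avoiding key lookups against the base table.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (TotalAmount, OrderStatus);

-- Every column this query touches lives in the index, so it is "covered".
SELECT OrderDate, TotalAmount, OrderStatus
FROM   dbo.Orders
WHERE  CustomerId = 42
ORDER  BY OrderDate;
```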

TECHNICAL SKILLS

  • Finance Platforms

BARRA, Factset, MarketQA, POINT (Barclays), SSC

  • Operating Systems

MS DOS, Windows 95/98/NT/2000/2003/XP/7

  • RDBMS and Database Tools

MS SQL 2008 R2, 2008, 2005 & 2000, Oracle 10G, MS-Access, Query Analyzer, Enterprise Manager, Management Studio, SQL Server Business Intelligence Development Studio, TOAD 10.6.1.3., SQL Developer

  • ETL Tools

SQL Server Integration Services (SSIS), DTS (Data Transformation Services), home-grown ETL using T-SQL, Pentaho Data Integration 4.2.0-GA

  • Development skills

Transact SQL (T-SQL).

  • Reporting Tools

SQL Server Reporting Services (SSRS), Excel, SharePoint

  • Programming Languages

C, C++, C#, ADO.NET, ASP.NET, T-SQL, HTML, CSS

  • Database Modeling

MS Visio, Erwin

EDUCATION

  • Master of Information Systems

EXPERIENCE

Confidential, Jan 2011 – Current
Waltham, MA
Sr. SQL/ETL Developer
This was an ongoing project to develop SMARTDEV, the data warehouse for the joint venture between the United States Postal Service (USPS) and Imagitas Inc., a Pitney Bowes subsidiary. Imagitas suggested the idea of an online change of address for USPS, which led to the development of the portal site mymove.com. MyMove is the Imagitas portal site linked to the USPS site: when users complete a change of address on the USPS site, they are directed to mymove.com, where they receive coupon offers from companies whose products can help with their move. The main purpose of the project was to develop the data warehouse on which reports could be built. These reports help determine the revenue generated by users' address changes and the websites users visited to redeem coupons.
Responsibilities:

  • Actively involved in the data analysis of database architecture of both the USPS site, known as MGO and Imagitas Portal site MyMove (MM).
  • Participated in resolving defects remaining from the previous release.
  • Altered as well as created transformations using SQL Server Integration Service (SSIS) for data conversion and populating dimensions.
  • Updated the complex queries for the transformations used to populate the fact tables.
  • Optimized the complex queries by creating the indexes on the tables and removing the unnecessary joins and sub queries.
  • Automated the creation of URL components by parsing them into tables, a task previously done manually using a flat file source.
  • After fixing the defects, moved to the second phase: data migration from the USPS site.
  • Participated in the design of logical and physical development of the data warehouse.
  • Designed the Snowflake schema using Microsoft Visio for creating the STAGE database, ARCHDATA, Central Data Warehouse (CDW) and Data Mart (DM) to accommodate the data from the USPS site.
  • Designed and Developed the ETL process to achieve the purpose of the development of the STAGE, ARCHDATA, CDW and DM.
  • Architected the ETL process for bringing the required data from ARCHDATA into dimension and fact tables in the DW as per business requirements.
  • Created the basic SSIS Framework Technical documents consisting of Naming conventions for packages, transformations, connection managers, log files etc.
  • Created the base SSIS template package including all necessary components considering best practices like Basic Logging (both SQL server and text file logging), configuration, connection managers, pre execution, post execution etc. This package was used as base for the development by all developers. This package was created taking into account frequently used business requirements making the development process more rapid, modular and extensible.
  • Created Stored Procedures, functions, tables for tracking the package execution process (start time, end time, load information, suspended records, status etc.)
  • Gathered Business requirement for the MGO to support the reporting needs for the analytics team. Worked as a lead developer for the automated workflow process using SSIS packages which implemented Business Rules to validate data, integration with Dashboard to track the status of submission, implementation of business approval process via File system process and successful loading of data to Data warehouse.
  • Created approximately 30 packages, including extraction from flat files, Excel files, Oracle and SharePoint lists, and loading the data into SQL Server.
  • Implemented all types of control flow and data flow transformations in packages to handle different logics.
  • Created data warehouse packages to load Slowly Changing Dimensions (SCD1 and SCD2) using the Merge statement as well as upsert in SSIS (Lookup and Merge techniques).
  • Created approximately 40 SSIS packages to create Excel reports, using VB.NET scripting in the Script Task to handle custom formatting of Excel sheets (logos, styles, etc.). All parameters could be handled dynamically from a configuration table.
  • Created various stored procedures using dynamic SQL and advanced T-SQL techniques such as Common Table Expressions, pivoting, grouping sets, ranking functions, etc.
  • Used event handlers to send custom e-mail notifications to business users as well as technical teams.
  • With the help of the DBA, enabled the Change Data Capture (CDC) in SQL Server Management Studio to create the check point for the data recovery in case of package failure during the incremental or full load.
  • Created the query in the Execute SQL Task to retrieve the timestamp for CDC in every package, providing the ability to rerun a package from the point at which it failed or to update the table for any changes made from a specific time.
  • Created purge process package to delete old files depending upon the retention period. All parameters could be handled dynamically without making any changes in package by using configuration tables.
  • Used SharePoint web service to get data from SharePoint list and load to data warehouse after the data validation.
  • Created XSD and XML files used by the SharePoint application from the SQL Server database using dynamic SQL.
  • Created build scripts for code promotion to SIT/QA/PROD environments.
  • Created Test cases for deployment to SIT, QA and Production Environments.
  • Used Tortoise SVN on regular basis for version controlling of SSIS packages as well as various T-SQL objects.
  • Created runbooks for the entire SSIS development for the ease of the Production Support team.
  • Accommodated the new transformations in the existing job, which was previously loading data for a few tables from the USPS site.
  • Updated the Master Job for the change made for MGO data and ran it in development environment.
  • After validating the data integrity and data conversion moved the transformations and jobs to the testing environment and made the final integrity test.
  • Following the decision of the company's executives, worked with the Architect and Project Manager to move the ETL process from SSIS to Pentaho Data Integration.
  • Participated in installation and configuration of the PDI version 4.2.0 server and created the repositories in Development, Test and Production.
  • Created the connections in the Kettle properties for extracting the data from databases and excel sheet and flat file from Linux server.
  • Designed and developed the transformations in PDI based on the existing ones from SSIS.
  • Created the table for logging the basic and detailed package activities during the execution period.
  • Actively involved in architecting the SMART job for the daily load into STAGE, ARCHDATA, Data Warehouse and Data Mart.
  • Moved the SMART job into the Test environment along with all the transformations for STAGE, ARCHDATA and PROD.
  • After several days of successful testing in the Test environment, moved the SMART job to PROD and stopped the SSIS jobs.
  • Log table was monitored very closely for about a week and data validation was done on regular basis.
  • After the successful deployment of the SMART jobs, the regular user stories were followed to create reports using SQL Server Reporting Services.
  • Provided 2 months of Production Support which includes maintaining and documenting communication with business users on daily basis to resolve data related issues.
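The SCD Type 2 load via the Merge statement described above can be sketched in T-SQL; DimCustomer, the staging table, and the tracked columns are hypothetical stand-ins:

```sql
-- Hypothetical Type-2 dimension load (SQL Server 2008+ MERGE pattern):
-- MERGE expires the current row when a tracked attribute changes and
-- inserts brand-new members; the OUTPUT clause feeds the expired rows
-- back to an outer INSERT that creates their new versions.
DECLARE @Today date = CAST(GETDATE() AS date);

INSERT INTO dbo.DimCustomer
        (CustomerId, City, EffectiveFrom, EffectiveTo, IsCurrent)
SELECT  CustomerId, City, @Today, '9999-12-31', 1
FROM (
    MERGE dbo.DimCustomer AS dim
    USING stg.Customer    AS src
       ON dim.CustomerId = src.CustomerId AND dim.IsCurrent = 1
    WHEN MATCHED AND dim.City <> src.City THEN
        UPDATE SET dim.IsCurrent = 0, dim.EffectiveTo = @Today
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, City, EffectiveFrom, EffectiveTo, IsCurrent)
        VALUES (src.CustomerId, src.City, @Today, '9999-12-31', 1)
    OUTPUT $action, src.CustomerId, src.City
) AS expired (MergeAction, CustomerId, City)
WHERE expired.MergeAction = 'UPDATE';
```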
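Enabling Change Data Capture, as mentioned above, uses the built-in CDC stored procedures (SQL Server 2008+); the source table name below is a hypothetical example:

```sql
-- Enable CDC at the database level (requires sysadmin),
-- then on the table whose changes the packages track.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'AddressChange',  -- hypothetical source table
     @role_name     = NULL;              -- no gating role

-- An incremental load can then read changes between two LSNs, e.g.:
-- SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_AddressChange(@from_lsn, @to_lsn, N'all');
```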

Environment:
MS SQL Server 2008/2008R2, T-SQL, SSIS, Win 2003, Win 2008 Server R2, Tortoise SVN 1.7.6, Oracle 10G, TOAD 10.6.1.3., SQL Developer, Oracle SQL, Pentaho Data Integration 4.2.0.

Confidential, NYC Mar 2008 – Dec 2010
MS SQL/ETL Developer
PineBridge Investments is one of the leading asset management groups. The company's main businesses include listed equities (such as emerging markets), fixed income, and alternative investments such as hedge funds. The project was to create and implement packages to extract data from various sources such as SQL Server, Excel files, SharePoint lists, and flat files, then load them into the MAGIC Data Warehouse after various kinds of validation. As this was a new project, everything was started from scratch, with considerable planning and documentation following SSIS development best practices.
Responsibilities:

  • Worked with business users and the data architect to identify business requirements, and worked in Rapid Application Development (RAD).
  • Involved in building the PineBridge data warehouse, whose primary data comes from external vendors and corporate sources and includes financial data such as fixed income, bonds and time deposits.
  • Involved in designing the workflow process of the Assets under Management (AUM) file which would provide business with the cross investment of portfolio in various funds.
  • Worked with the files which contain the corporate actions for each business day for those financial products.
  • Involved with Interface builds to External applications BARRA, MarketQA, Factset and POINT (Barclays Fixed Income Attribution System).
  • Worked with the business team to understand the Customer Account Hierarchy and implemented better back-end maintenance strategies.
  • Gathered Business requirement for the AUM workflow System to support the private Equity side of business. Worked as a lead developer for the automated workflow process using SSIS packages which implemented Business Rules to validate data, integration with Dashboard to track the status of submission, implementation of business approval process via File system process and successful loading of data to Data warehouse.
  • Developed SQL Packages, Procedures and Functions in accordance with Business Requirements for loading data into database tables.
  • Participated in database logical design to fit new business requirement, and implemented new design into SQL Server 2008R2.
  • Created the database structures with all the required tables along with the constraints and referential integrity in development and testing environment.
  • Ran the DDL and DML scripts for the creation and alteration of the tables in DEV and TEST environment.
  • Wrote complex queries to extract the data from source and used transformation like Insert Update, Database Lookup, Merge Join, Execute SQL Task etc. to perform the ETL task.
  • Created data warehouse packages to load Slowly changing dimensions (SCD1 and SCD2) using Merge Statement as well as upsert in SSIS (Lookup and Merge Techniques)
  • Created SSIS packages for processing fact and dimension tables with complex transforms and also configured and deployed SSIS packages to import and export data.
  • Validated the data mapping and data type from source to destination to ensure the data quality and smooth ETL process.
  • Created complex stored procedures for ad hoc reports as well as weekly and monthly reports.
  • Generated drill down and drill through reports from data warehouse using SQL Server Reporting Services (SSRS).
  • Involved in database optimization by creating indexes on the fields used in reports and partitioning tables to expedite query execution.
  • Filtered bad data from legacy system using complex T-SQL statements, and implemented various constraint and triggers for data consistency.
  • Created and maintained database, tables, user logins, views, indexes, and constraints to implement business rules and also created Triggers to enforce data and referential integrity.
  • Created stored procedures for maintaining SQL Server and for Application Developers.
  • Optimized indexes with the use of the Index Tuning Wizard and rebuilt indexes.
  • Used SQL Server Profiler to trace the slow running queries and the server activity.
  • Wrote tech spec document for all the activities involved to complete the process successfully.
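The table partitioning used to expedite report queries above might look like the following T-SQL sketch; the yearly boundaries and the FactHoldings table are illustrative assumptions:

```sql
-- Partition a large fact table by year so a report that filters on a
-- single year scans only one partition (partition elimination).
CREATE PARTITION FUNCTION pfByYear (date)
    AS RANGE RIGHT FOR VALUES ('2009-01-01', '2010-01-01', '2011-01-01');

CREATE PARTITION SCHEME psByYear
    AS PARTITION pfByYear ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactHoldings (
    HoldingId   bigint IDENTITY(1,1) NOT NULL,
    AsOfDate    date   NOT NULL,   -- partitioning column
    PortfolioId int    NOT NULL,
    MarketValue money  NOT NULL
) ON psByYear (AsOfDate);
```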

Environment:
SQL Server 2008/2008R2, T-SQL, SSIS, SSRS, Win 2003, SVN, Microsoft Visio 2007, Microsoft Project 2007

Confidential, Sep 2007 – Feb 2008
Staten Island, NY
Database Developer
Professional Transcription Company (PTC) serves the health care industry, transcribing dictations made by doctors. PTC maintains patient data for hospitals and clinics, along with administrative data.
The project was to develop a data warehouse for PTC for marketing purposes. In addition to the data warehouse development, the work included performance tuning: rewriting stored procedures, functions and triggers for application developers on the existing database that supports the company's website.
Responsibilities:

  • Performed logical and physical data modeling as well as assessing business rules.
  • Built and maintained stored procedures for application developers and user-defined functions for business logic implementation.
  • Created Triggers to enforce data and referential integrity.
  • Created new user databases, database files, database file groups in SQL Server 2008 using SSMS and T-SQL.
  • Created and modified stored procedures, functions and triggers.
  • Extensively involved in the full lifecycle of ETL software development for the project: analysis, design, build, documentation and testing.
  • Designed and developed SSIS packages, stored procedures, configuration files, tables, views, functions and implemented best practices to maintain optimal performance.
  • Generated production reports, parameterized reports and regular reports using SSRS.
  • Used stored procedures, functions, tables, views and triggers and worked in designing and developing Reports, Data warehouses, Data marts and Business Intelligence using multi-dimensional models such as Star Schemas and Snow Flake Schemas.
  • Reviewed cube with end user and changed the cube and documentation according to their requirements.
  • Designed fact and dimension tables and defined the relationships between them.
  • Created logical multidimensional models and studied relationships among the logical objects.
  • Designed, administered and developed Star Schema data marts.
  • Moved objects between the Development, Staging, Pre-Production and Production environments.
  • Enforced referential integrity by the creation of constraints on database objects.
  • Developed database components using ADO.NET and VB.NET for data access.
  • Created complex SSAS cubes with multiple fact measures groups, and multiple dimension hierarchies based on the OLAP reporting needs.
  • Worked on DTS Packages, DTS Import/Export for transferring data from Heterogeneous Databases.
  • Maintained SQL Script for creation of Database Objects.
  • Monitored, Optimized and Performance Tuning of database using Performance monitor, SQL Profilers, Activity Monitor.
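A star-schema fragment with enforced referential integrity, as described above, can be sketched as follows; the table and column names are hypothetical:

```sql
-- One dimension and one fact table; the foreign key constraint
-- enforces referential integrity between fact rows and the dimension.
CREATE TABLE dbo.DimProvider (
    ProviderKey  int IDENTITY(1,1) PRIMARY KEY,
    ProviderName varchar(100) NOT NULL,
    Specialty    varchar(50)  NULL
);

CREATE TABLE dbo.FactTranscription (
    TranscriptionKey bigint IDENTITY(1,1) PRIMARY KEY,
    ProviderKey      int      NOT NULL
        CONSTRAINT FK_FactTranscription_DimProvider
        REFERENCES dbo.DimProvider (ProviderKey),
    DictationDate    datetime NOT NULL,
    LineCount        int      NOT NULL
);
```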

Environment:
SQL Server 2005/2008, SQL Server Integration Services (SSIS), SSAS, SSRS, MS Excel 2007, MS Access 2007, Enterprise Manager, Management Studio, MS SQL Query Analyzer and SQL Profiler.

Confidential, Feb 2006 – Aug 2007
Oaks, PA
Data Warehouse Developer

SEI provides investment processing and fund processing services to help money managers better operate their businesses. SEI supports a full range of investment products, including mutual funds, hedge funds, and private equity funds, and also supports the operations for separately-managed account offerings. The project involved moving funds data from the production system to the Decision Support System (DSS), maintaining historical data, and populating the dimension and fact tables in the DSS.
Responsibilities:
  • Developed SSIS packages to move data from Production to DSS.
  • Involved in business requirement gathering, technical design documents, business use cases and data mapping.
  • Developed database SSIS packages, tables, triggers, and indexes using T-SQL, Query Analyzer and Enterprise Manager.
  • Configured the SSIS packages for run-time parameters and configuration files.
  • Developed the documents for logging/error handling for SSIS packages.
  • Handled orphan rows in dimension tables.
  • Wrote the stored procedure to calculate manager ranks.
  • Wrote the stored procedure to calculate the net and gross returns of the funds.
  • Wrote the functions to convert fund values into a single currency.
  • Performed T-SQL tuning and optimized queries for reports that took a long time to execute.
  • Developed interface and reporting stored procedures, views, and UDFs for efficient data storage and manipulation.
  • Developed the Slowly Changing Dimension transformation to maintain the historical data.
  • Developed Derived Column transformations to calculate manager and index returns.
  • Developed the SSRS reports for money managers.
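The manager-ranking stored procedure mentioned above can be sketched with a window function; the ManagerReturns table and its columns are hypothetical:

```sql
-- Rank managers within each period by net return, highest first.
-- RANK() gives tied managers the same rank.
CREATE PROCEDURE dbo.usp_GetManagerRanks
AS
BEGIN
    SET NOCOUNT ON;

    SELECT PeriodEnd,
           ManagerId,
           NetReturn,
           RANK() OVER (PARTITION BY PeriodEnd
                        ORDER BY NetReturn DESC) AS ManagerRank
    FROM dbo.ManagerReturns;
END;
```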
Environment: SQL Server 2005, SQL Server Integration Services (SSIS), MS Excel 2003, MS Access 2003, Enterprise Manager, Management Studio, MS SQL Query Analyzer, SQL Profiler and SSRS.
