
Sr ETL Developer/Data Analyst Resume


Bridgewater, New Jersey

SUMMARY

  • Seven (7) years of IT experience in the Analysis, Design, Development, Testing and Implementation of business application systems for the financial sector, using Data Warehouse/Data Mart Design, ETL, OLAP, BI, and Client/Server applications.
  • Certified professional in Java and C Programming Languages
  • Certified Informatica Powercenter Developer.
  • Extensive experience developing Teradata SQL, stored procedures, and BTEQ scripts.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Experience in all aspects of the software development life cycle (SDLC): analysis, design, development, implementation, testing, and support.
  • Created warehousing solutions using the Relational On-Line Analytical Processing (ROLAP) approach.
  • Strong experience in ETL workflows for multiple data sources like flat files, XML, Teradata, Oracle, SQL Server.
  • Experience with tools like SQL Server Management Studio, SQL Server 2005/2008 Integration Services (SSIS), and Reporting Services (SSRS).
  • Worked extensively in building Dimensions, Bridges, Facts, Star Schemas, Snowflake (Extended Star) Schemas, and Galaxy Schemas.
  • Experienced in upgrade and migration of OBIEE between dev/test/prod environments.
  • Proficient in Installation, configuration, and administration of the OBIEE platform
  • Extensive experience in creating executive reports with BI Publisher integrated with OBIEE.
  • Experienced in configuring and setting up OBIEE security using LDAP and external database tables, and configuring object-level and database-level security.
  • Experienced in QlikView Server and Publisher maintenance: creating scheduled jobs for QVD extracts and report reloads.
  • Worked as a QlikView Technical Consultant for a wide variety of business applications.
  • Built data load regulation programs with inherent ability to restart on failure using UNIX shell scripting
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, and Pig.
  • Good exposure to Apache Hadoop MapReduce programming, Pig scripting, distributed applications, and HDFS.
  • Involved in setting up standards and processes for Hadoop-based application design and implementation.
  • Experienced in Oracle and developed ETL code using Oracle PL/SQL.
  • Optimized Oracle stored procedures and SQLs for better performance

TECHNICAL SKILLS

Certification: Java, C Programming, PowerCenter Developer

Operating Systems: Windows 7, Windows XP, Unix, Linux, MS-DOS

BI/ETL Tools: Informatica Power Center 10.1/9.1/8.6/8.5/8.1

RDBMS: Oracle 11g/10g/9i, Teradata 15.10, MS SQL Server 2008, DB2 v8.1

Data Modelling Tools: Erwin, MS Visio

OLAP Tools: Cognos 8.4, Business Objects XI r2/6.x, OBIEE 10.1.3.4, Qlikview 12.10

Scheduling Tools: Autosys, Control-M

Languages: SQL, PL/SQL, UNIX shell scripting, C, Java, HTML, JavaScript

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Functional Expertise: Treasury, Financial Services, Banking

Project Management Tools: Planview, MS Project, MS Notes, Service Now

PROFESSIONAL EXPERIENCE

Confidential, Bridgewater, New Jersey

Sr ETL Developer/Data Analyst

Responsibilities:

  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Developed metadata repository using OBIEE Administration tool in Physical, Business Model and Mapping, and Presentation Layer.
  • Created security settings in OBIEE Administration Tool and set up groups, access privileges and query privileges and also managed security for groups in Answers.
  • Integrated BI Publisher with OBIEE to build reports in Word and Excel formats.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Developed SQL Server stored procedures and tuned SQL queries (using indexes and execution plans).
  • Created four dashboards and helped other teams create QlikView reports.
  • Worked with QlikView 12.x, SQL Server, Oracle, Netezza, Excel, CSV files.
  • Developed dashboards pertaining to KPI monitoring using QlikView 12.x.
  • Enabled speedy reviews and first mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and PIG to pre-process the data
  • Developed performance utilization charts, optimized and tuned SQL and designed physical databases. Assisted developers with Teradata load utilities and SQL.
  • Created tables, views in Teradata, according to the requirements.
  • Used Teradata load utilities (FastExport, FastLoad, MultiLoad, and TPump) to load data into the Teradata data warehouse from Oracle and DB2 databases.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Resolved various defects in a set of wrapper scripts that executed the Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.
  • Performance tuning of Oracle Databases and User applications.
  • Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.
  • Designed and developed OLAP Cubes and Dimensions using SQL Server Analysis Services (SSAS).
  • Developed MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Extract, Transform, Load (ETL) development using SQL Server 2005/2008 Integration Services (SSIS).
  • Created source and target table definitions using SSIS; source data was extracted from flat files, SQL Server, and DB2 databases.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of repositories and folders.
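
Workflow kick-offs like the pmcmd bullet above typically reduce to a small shell wrapper. A minimal sketch, assuming hypothetical service, domain, folder, and workflow names (the `PMCMD=echo` indirection is only so the command can be dry-run without an Informatica install; real runs also pass credentials via `-u`/`-p` or a connect file):

```shell
#!/bin/sh
# Hypothetical wrapper around Informatica's pmcmd command-line client.
# Set PMCMD=echo to print the arguments instead of executing them.
PMCMD="${PMCMD:-pmcmd}"

run_workflow() {
    # usage: run_workflow <folder> <workflow>
    # -wait blocks until the workflow finishes, so the exit status
    # of this function reflects the outcome of the run.
    "$PMCMD" startworkflow -sv INT_SVC -d DOM_DEV \
        -f "$1" -wait "$2"
}

# Dry run: print the pmcmd arguments instead of executing them.
PMCMD=echo
run_workflow FIN_DW wf_load_accounts
```

Because `-wait` makes pmcmd block until completion, a scheduler (Autosys/Control-M) can key success or failure of the job directly off this script's exit status.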

Technology: Windows 7, Unix, Informatica Power Center 10.1/9.1, Oracle 11g, Teradata 15.10, MS SQL Server 2008, DB2 v8.1, Erwin, OBIEE 10.1.3.4, Qlikview 12.10, Autosys, SQL, PL/SQL, UNIX, Shell scripts, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential, Stamford, Connecticut

ETL Lead/Developer

Responsibilities:

  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as Multi Load, Fast Load, Fast Export, BTEQ (Teradata) and SQL*Plus, SQL*Loader (Oracle).
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs.
  • Involved in Data mining through Teradata miner.
  • Used the Teradata EXPLAIN facility, which describes to end users how the database system will perform a request.
  • Created Hierarchies, Levels, and implemented Business Logic by creating level based measures in OBIEE business model & mapping layer.
  • Created Security settings in OBIEE Administration Tool to set up groups, access privileges and query privileges and also managed security for groups in Answers.
  • Configured and created the repository using the OBIEE Administration Tool. Worked with the TS/API product, a system that allows products designed for SQL/DS to access the Teradata database machine without modification.
  • Coded using Teradata BTEQ SQL; wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Responsible for positioning and delivering QlikView projects and cross building applications into new and existing customer base.
  • Worked as a Subject Matter Expert (SME) for various projects and trained users on QlikView.
  • Performance Tuning in SQL Server 2008 using SQL Profiler and Data Loading.
  • Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and oracle tables to target tables.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Installed SQL Server client-side utilities and tools for all front-end developers/programmers.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Involved in writing procedures, functions in PL/SQL.
  • Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
  • Used Remote Function Call (RFC) as the SAP interface for communication between systems.
  • Implemented RFCs for the caller and the called function modules running in the same system.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Administered Pig, Hive, and HBase, installing updates, patches, and upgrades.
  • Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
  • Extract Transform Load (ETL) development using SQL Server 2005, SQL 2008 Integration Services (SSIS).
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Installed and configured MapReduce, HIVE and the HDFS; implemented CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.
  • Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS
  • Enhanced and deployed SSIS packages from the development server to the production server.
  • Worked with SQL*Loader tool to load the bulk data into Database.
  • Prepared UNIX shell scripts that were scheduled in Autosys for automatic execution at specific times.
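
The BTEQ wrapper scripting mentioned above mostly comes down to checking the utility's exit code. A minimal sketch, with hypothetical script and log names; BTEQ exits 0 on success and with increasing nonzero codes (conventionally 2, 4, 8, 12) as severity rises, and one common convention is to continue below 8 and abort at 8 or above:

```shell
#!/bin/sh
# Sketch of the return-code handling inside a BTEQ wrapper script.
# Names and thresholds follow one common convention, not a standard.

classify_rc() {
    # Map a BTEQ exit code to an action for the wrapper:
    # 0 -> continue, 2/4 -> warn but continue, 8+ -> abort.
    if [ "$1" -ge 8 ]; then echo abort
    elif [ "$1" -ge 2 ]; then echo warn
    else echo continue
    fi
}

run_bteq() {
    # usage: run_bteq <script.btq>   (requires bteq on PATH)
    bteq < "$1" > "${1%.btq}.log" 2>&1
    [ "$(classify_rc $?)" != abort ]
}

# Example invocation (commented out; needs a Teradata client):
#   run_bteq load_accounts.btq || exit 1
```

Logging stdout/stderr to a per-script file keeps the scheduler's job log small while preserving the full BTEQ session output for defect analysis.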

Technology: Windows 7, Linux, Informatica Power Center 9.1, Oracle 11g, Teradata 15.10, MS SQL Server 2008, DB2 v8.1, Erwin, MS Visio, OBIEE 10.1.3.4, Qlikview 12.10, Autosys, SQL, PL/SQL, UNIX, Shell scripts, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential, Stamford, Connecticut

Sr ETL Developer

Responsibilities:

  • Wrote conversion scripts using SQL, PL/SQL, stored procedures, functions and packages to migrate data from SQL server database to Oracle database.
  • Used Teradata Manager, Index Wizard, and PMON utilities to improve performance.
  • Populated or refreshed Teradata tables using FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Worked on DTS/SSIS for transferring data from Heterogeneous Database (Access database and xml format data) to SQL Server.
  • Involved in data integration by identifying information needs within and across functional areas of the enterprise; performed database upgrades and migrations with the SQL Server Export Utility.
  • Worked on the Reports module of the project as a developer on MS SQL Server 2005 (using SSRS, T-SQL, scripts, stored procedures and views).
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 8.5.
  • Integrated heterogeneous data sources like Oracle, DB2, SQL Server, and flat files (fixed-width and delimited) into the staging area.
  • Wrote SQL-Overrides and used filter conditions in source qualifier thereby improving the performance of the mapping.
  • Developed many Reports / Dashboards with different Analytics Views (Drill-Down, Pivot Table, Chart, Column Selector, and Tabular with global and local Filters) using OBIEE.
  • Reduced Teradata space used by optimizing tables - adding compression where appropriate and ensuring optimum column definitions.
  • Used OBIEE Web Catalog to set up groups, access privileges and query privileges.
  • The data obtained from various sources was fed into the staging area in Teradata.
  • Involved in loading of data into Teradata from legacy systems and flat files using complex MLOAD scripts and Fast Load.
  • Extracted data from Oracle database transformed and loaded into Teradata database according to the specifications.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Created reports for the BI team using Sqoop to export data into HDFS and Hive.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Migrated SSIS Packages from SQL Server 2005 to SQL Server 2008.
  • Managed and reviewed Hadoop log files.
  • Extract, transform, and load data from multiple data sources into the QlikView application
  • The project initially used Tableau but migrated to QlikView in the second week of development.
  • Implemented 11g and upgraded the existing database from Oracle 10g to Oracle 11g.
  • Designed, created, and maintained DB2 database objects
  • Design, Build, Test, Debug, Monitor, and Troubleshoot QlikView solutions
  • Developed a Conceptual model using Erwin based on requirements analysis
  • Involved in writing Teradata SQL bulk programs and in Performance tuning activities for Teradata SQL statements using Teradata EXPLAIN
  • Audited application SQL code with DB2 Explain prior to production implementation
  • Used PMCMD command to automate the Power Center sessions and workflows through UNIX.
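
Moving relational data into HDFS/Hive for the BI team, as in the Sqoop bullet above, is `sqoop import` in Sqoop's terminology. A hedged sketch in which the JDBC URL, username, tables, and mapper count are all hypothetical; the command is built as a string so it can be reviewed or logged before being handed to the shell:

```shell
#!/bin/sh
# Sketch of a Sqoop import invocation for an Oracle-to-Hive load.
# All connection details below are placeholder values.

build_sqoop_import() {
    # usage: build_sqoop_import <source_table> <hive_table>
    # -P prompts for the password; -m sets the number of mappers.
    echo "sqoop import" \
        "--connect jdbc:oracle:thin:@//dbhost:1521/ORCL" \
        "--username etl_user -P" \
        "--table $1 --hive-import --hive-table $2 -m 4"
}

build_sqoop_import ACCOUNTS dw.accounts
```

On a cluster with Sqoop installed, the built command would be executed directly; `--hive-import` makes Sqoop create/load the Hive table after staging the data in HDFS.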

Technology: Windows 7, Linux, Informatica Power Center 9.1/8.6, Oracle 11g/10g, Teradata 14.1, MS SQL Server 2008, DB2 v8.1, Erwin, MS Visio, Cognos 8.4, Business Objects XI r2/6.x, OBIEE 10.1.3.4, Qlikview 12.10, Autosys, Control-M, SQL, PL/SQL, UNIX, Shell scripts, Java, JavaScript, Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Sqoop

Confidential, Stamford, Connecticut

Sr. ETL Developer

Responsibilities:

  • Collected requirements from business, studied the same extensively; mapped subject areas onto dimensions, identified facts from the Key Performance Indexes (KPIs) and formulated conformed dimensions using Dimensional Modeling.
  • Created packages using SSIS for data extraction from Flat Files, Excel Files, and OLEDB to SQL Server.
  • Performance tuning of Oracle Databases and User applications.
  • Modeled multiple data marts within the warehouse and articulated respective Star/Snow-Flake Schemas.
  • Analyzed and performed tuning on IMS and DB2 databases, which reduced DASD and tape costs by up to $70,000 annually.
  • Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows to load the data from source to stage, stage to dimensions, bridges, facts, summary and snapshot facts.
  • Designed and developed OLAP Cubes and Dimensions using SQL Server Analysis Services (SSAS).
  • Used External Loaders like Multi Load, T Pump and Fast Load to load data into Teradata database.
  • Loaded data into Teradata using DataStage, FastLoad, BTEQ, Fast Export, MultiLoad, and Korn shell scripts.
  • Analyzed business requirements, transformed data, and mapped source data using the Teradata Financial Services Logical Data Model tool, from the source system to the Teradata Physical Data Model.
  • Installed, implemented, and trained staff in the use of Platinum DB2 tools, Candle Monitor for DB2, and DB2 tuning techniques.
  • Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.
  • Made use of various Designer transformations like Source Qualifier, Connected and Unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence, Union, and Update Strategy while creating mapplets/mappings.
  • Used Model Mart of ERwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Used DTS Packages as ETL tool for migrating Data from SQL Server 2000 to Oracle 10g
  • Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
  • Fine-tuned procedures/SQL queries for maximum efficiency in various databases using Oracle hints for rule-based optimization.
  • Contributed toward Informatica upgrade from version 8.6.1 to 9.1.0.
  • Developed Oracle PL/SQL code to write DDL/DML statements and Stored Procedures for data transformation and manipulation.
  • Created UNIX shell scripts to FTP and cleanse source data files and archive data and log files.
  • Worked on Teradata Queryman to validate the data in the warehouse for sanity checks.
  • Sequenced the different ETL components (Informatica workflows/Oracle SQLs/UNIX scripts) and listed them in a list file; then created a data load regulation program by writing a parent UNIX shell script that executes the components in the list file sequentially.
  • Invoked Informatica using “pmcmd” utility from the UNIX script and Oracle using “sqlplus”.
  • Incorporated the mechanism to restart the load in the main program itself by logging status to ensure smooth resumption in case of any failures during the load.
  • Created modules and chains to schedule the loads using UC4 Applications Manager.
  • Created and maintained DDL for all DB2 databases.
  • Successfully deployed data marts (DM) like Prospecting/Sales DM, Finance DM, Risk DM, Originations DM, and Customer Analytics DM to constitute the entire warehouse.
  • Performed SYSADM and Production DBA function for all DB2 subsystems and applications. Resulted in a stable environment for a company with a 24X7 uptime requirement.
  • During the course of the project, participated in multiple meetings with the client to propose better strategies for performance improvement and gather new requirements.
  • Provided support for the applications after production deployment to take care of any post-deployment issues.
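
The load regulation program described in the bullets above can be sketched as a parent script that replays a list file and checkpoints progress, so a rerun after a failure resumes at the failed component. File names are hypothetical, and checkpointing by command text assumes each component line in the list file is unique:

```shell
#!/bin/sh
# Sketch of a data load regulation parent script: run the components
# named in a list file in order, checkpoint each success to a status
# file, and on restart skip the ones already done.

run_batch() {
    # usage: run_batch <list_file> [status_file]
    list="$1"
    status="${2:-$1.status}"
    : >> "$status"                       # create checkpoint file if absent
    while IFS= read -r step; do
        [ -z "$step" ] && continue       # ignore blank lines
        grep -Fxq "$step" "$status" && continue   # done in a prior run
        if sh -c "$step"; then
            echo "$step" >> "$status"    # checkpoint the success
        else
            echo "failed at: $step" >&2
            return 1                     # a rerun resumes from this step
        fi
    done < "$list"
}

# Example: run_batch nightly_load.lst nightly_load.status
```

Each list-file line can be a `pmcmd` call, a `sqlplus` invocation, or any other shell command, which matches the mixed workflow/SQL/script sequencing described above; deleting the status file forces a clean full rerun.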

Technology: Windows 7, Unix, Informatica Power Center 8.6/8.5/8.1, Oracle 11g/10g, Teradata 14.1, MS SQL Server 2008, Cognos 8.4, Business Objects XI r2/6.x, Autosys, Control-M, SQL, PL/SQL, UNIX, Shell scripts

Confidential, Stamford, Connecticut

ETL Developer

Responsibilities:

  • Interact with the DA and Reporting teams to understand business and functional rules to put in the ETL design and close gaps in model/mappings
  • Creation of technical design documents (High level and low level) for proposed ETL build
  • Development of ETL stage design
  • Development of PL/SQL procedures/packages
  • Development of Informatica mapping build based on finalized ETL design
  • Identified and tuned ETL performance issues
  • Monitored and maintained ETL Jobs in multiple database environments
  • Developed scripts and an error-handling strategy; built sessions, worklets, and workflows.
  • Documented Informatica mappings in Excel spreadsheets.
  • Developed and reviewed ETL workflows for multiple data sources like flat files, XML, Teradata, DB2, Oracle, SQL Server.
  • Writing of SQL Scripts and PL/SQL Scripts to extract data from Database and for Testing Purposes.
  • Developed mappings in Informatica using BAPI/ABAP/RFC function calls in SAP.
  • Understood how IDocs work in SAP; analyzed SAP functional and technical documents to implement them in Informatica using PowerConnect.
  • Used Oracle JDeveloper to support JAVA, JSP and HTML codes used in modules.
  • Created/Enhanced Teradata Stored Procedures to generate automated testing SQLs
  • Managed all indexing, debugging, optimization, and performance tuning using SQL Profiler and SQL Server Agent.
  • Wrote UNIX shell scripts to customize FTP/scheduling jobs.
  • Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.
  • Populated or refreshed Teradata tables using FastLoad, MultiLoad, and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Prepared UNIX shell scripts that were scheduled in Autosys for automatic execution at specific times.
  • Prepared unit test plans and reconciled test results.
  • Coordinated with the QA team to plan and execute QA testing and resolve any defects.
  • Coordinated with deployment and support resources to ensure that solutions get implemented smoothly into QA and production environments.
  • Provided user support during application stabilization.
  • Handed the application over to the support team.
  • Perform project management and status reporting related activities for onsite/offsite project team.
  • Created Dashboard using Denodo to give quick view of reported trades
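
The FTP job customization mentioned above is often done by generating the command file fed to batch-mode ftp. A minimal sketch in which the host, directories, and file names are all examples (a `user` line or a `.netrc` entry would supply credentials in practice):

```shell
#!/bin/sh
# Sketch: generate a command file for non-interactive ftp transfers.
# All remote names below are placeholder values.

make_ftp_cmds() {
    # usage: make_ftp_cmds <remote_dir> <file>
    cat <<EOF
cd $1
binary
get $2
bye
EOF
}

# Typically piped into ftp with prompting and auto-login disabled:
#   make_ftp_cmds /outbound trades.dat | ftp -inv ftp.example.com
make_ftp_cmds /outbound trades.dat
```

Generating the command file from parameters is what makes the job easy to customize per feed, and the same script can then be registered in Autosys like any other shell job.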

Technology: Windows XP, Unix, Informatica Power Center 8.6/8.5/8.1, Oracle 10g, Teradata 13.1, MS SQL Server 2008, Business Objects XI r2/6.x, Autosys, SQL, PL/SQL, UNIX, Shell scripts, Java, HTML, JavaScript

Confidential

Intern

Responsibilities:

  • Created Dimension Tables and Fact Tables based on the warehouse design.
  • Wrote Triggers and Stored Procedures using PL/SQL for Incremental updates
  • Integrated various sources into the staging area in the data warehouse for consolidating and cleansing data.
  • Developed Mappings and Workflows as per the requirements.
  • Created the Source and Target Definitions in Informatica Power Center Designer
  • Upgraded Oracle 9i to 10g in different environments for the latest features and tested the databases.
  • Created Reusable Transformations and Mapplets to use in Multiple Mappings
  • Developed and updated documentation of processes and system.
  • Migrated mappings from Development to Testing and from Testing to Production.
  • Created various Documents such as Source-to-Target Data Mapping Document, and Unit Test Cases Document.

Technology: Windows XP, Unix, Informatica Power Center 8.5/8.1, Oracle 10g/9i, MS SQL Server 2008/2005, Business Objects XI r2/6.x, SQL, PL/SQL, UNIX, Shell scripts, C, Java, HTML, JavaScript
