
Data Analyst Resume


Orange County, CA

SUMMARY

  • Over 7 years of extensive experience in data migration, data warehousing, database design and manual testing.
  • Extensive experience with ETL/Data Warehousing tools in Financial, Communication, Utilities, Beverages and Retail Industry.
  • Involved in various stages of Software development life cycle.
  • Proficient in converting logical data models to physical database designs in Data warehousing Environment and in-depth understanding in Database Hierarchy, Data Integrity concepts.
  • Skilled in Data Warehousing data modeling using Star Schema and Snowflake Schema.
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities
  • Sourced data from disparate sources like Mainframe Z/OS, UNIX flat files, IBM DB2, Oracle, and SQL Server and loaded into Oracle and Teradata DW.
  • Extracted source data from Mainframe z/OS into the UNIX environment using JCL scripts and SQL, and created formatted reports for business users using BTEQ scripts.
  • Expertise in report formatting, batch processing, data loading and export using BTEQ.
  • Performed performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right Primary Index, scheduling statistics collection, and adding secondary and join indexes.
  • Well versed with Teradata Analyst Pack including Statistics Wizard, Index Wizard and Visual Explain.
  • Developed UNIX shell scripts and used BTEQ, FastLoad, Multiload, and Fast Export utilities extensively to load to target database.
  • Skillfully used the OLAP analytical power of Teradata via functions such as RANK, QUANTILE, CSUM, MSUM and GROUP BY GROUPING SETS to generate detail reports for the marketing team (a sketch follows this list).
  • Extensively used derived tables, volatile tables and global temporary (GTT) tables in many of the BTEQ scripts.
  • Involved in designing and building stored procedures and macros for the module.
  • Expertise in Oracle 9i/8.x/7.x, SQL/PL-SQL, procedures, functions, database packages and triggers; experienced in troubleshooting, tuning SQL statements, query optimization and dynamic SQL.
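
A minimal sketch of the kind of Teradata OLAP reporting query referenced above. The table and column names (sales_daily, region_cd, sale_dt, sale_amt) are hypothetical placeholders, not from any actual schema; the legacy CSUM/MSUM/RANK syntax shown assumes the GROUP BY clause acts as the partition for the ordered analytical functions.

    /* Illustrative only -- running total, 7-row moving sum and top-N rank per region */
    SELECT  region_cd,
            sale_dt,
            sale_amt,
            CSUM(sale_amt, sale_dt)     AS running_total,     -- cumulative sum ordered by sale_dt
            MSUM(sale_amt, 7, sale_dt)  AS moving_7_row_sum,  -- 7-row moving window sum
            RANK(sale_amt DESC)         AS amt_rank           -- rank of rows within each region
    FROM    sales_daily
    GROUP BY region_cd                                        -- partitions the OLAP functions by region
    QUALIFY RANK(sale_amt DESC) <= 10;                        -- keep only the top 10 rows per region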

EDUCATION: Bachelor of Technology

TECHNICAL SKILLS

  • Operating Systems: Windows XP, 2000, UNIX/Linux
  • Databases: Teradata 12.0, V2R5/V2R6.0, Oracle 8i/9i/10g, SQL Server 2000, DB2
  • Teradata tools & utilities (query facilities): SQL Assistant, BTEQ
  • Load & Export: FastLoad, MultiLoad, FastExport, BTEQ, OLE Load, Oracle SQL*Loader
  • ETL: Ab Initio 2.15, Informatica 8.1, DataStage 7.5
  • Languages: C, C++, Java, Visual Basic, ASP.NET, Perl, JCL
  • Scripting languages: UNIX Shell, JCL, JavaScript, VBScript, Perl
  • Others: Business Objects, Cognos, MicroStrategy, Tableau

EXPERIENCE

Confidential, Orange County, CA Feb 2012 – Feb 2013
Sr. Teradata Developer/Data Analyst
In the face of mounting energy challenges, Southern California Edison (SCE) is pioneering a new energy future with Edison SmartConnect, the nation's most advanced smart metering system. Edison SmartConnect has redefined the industry and continues to lead program management and the successful application of best smart metering practices. The technology used will enable customers to make smarter energy choices, saving them both energy and money.
The Global Benefits of Edison SmartConnect provide a foundation for numerous energy, money, and resource saving technologies and benefits, which include:

  • Edison SmartConnect enables customers to control energy use. Controlling energy use results in energy conservation and reduces the need for emergency generation.
  • Edison SmartConnect provides SCE the ability to rapidly respond to peak loads.
  • Edison SmartConnect customers can view the previous day’s energy usage, historical energy usage comparisons, and program and rate information at sce.com.
  • Edison SmartConnect allows for remote smart meter data collection and responses to routine requests such as turn-off/turn-on and theft/tampering events.

Responsibilities

  • Involved in business requirement gathering, Technical Design Documents, business use cases and data mapping.
  • Involved in creating Teradata tables, views and macros using feedback from the business users.
  • Designed mapping sheets for the complex views and business views to support the business view build.
  • Frequently used Teradata utilities such as BTEQ and MultiLoad for loading small amounts of data, and FastLoad for loading large volumes of data into the tables.
  • Extensively involved in query tuning to improve performance on high-volume Teradata tables using join indexes, secondary indexes, partitioned primary indexes, table compression, statistics collection, etc. (see the sketch after this list).
  • Worked on data analysis, data profiling, mapping, identifying data quality issues, data cleaning, data validation and loading.
  • Developed Complex Weekly and Monthly reports using advanced SQL and properly documented the scripts based on company standards.
  • Involved in writing complex SQL code using SQL Assistant and BTEQ to generate Ad-hoc reports for the business team.
  • Written and implemented Teradata stored procedures to implement business logic.
  • Frequently used different Aggregate and OLAP functions often with multiple derived tables.
  • Created graphical representation of reports such as Bar charts, Line Charts as per Business requirements.
  • Created and performed unit and integration test cases, UAT test case reviews, and turnover to production.
  • Active interaction with business users, participating in onshore/offshore conferences with the client, downstream business users and other vendors under consideration.
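
A hedged sketch of the kind of tuning DDL described above (partitioned primary index, value compression, statistics collection and an aggregate join index). All object and column names (dw_db.usage_fact, meter_id, read_dt, kwh_used) are hypothetical, not taken from the actual environment.

    /* Illustrative only -- high-volume fact table with a partitioned primary index and compression */
    CREATE MULTISET TABLE dw_db.usage_fact
         ( meter_id    INTEGER NOT NULL
          ,read_dt     DATE    NOT NULL
          ,kwh_used    DECIMAL(12,3)
          ,channel_cd  CHAR(2) COMPRESS ('01','02','03')     -- multi-value compression
         )
    PRIMARY INDEX ( meter_id )
    PARTITION BY RANGE_N ( read_dt BETWEEN DATE '2012-01-01'
                                       AND DATE '2013-12-31' EACH INTERVAL '1' MONTH );

    /* Scheduled statistics collection used by the optimizer */
    COLLECT STATISTICS ON dw_db.usage_fact COLUMN (meter_id);
    COLLECT STATISTICS ON dw_db.usage_fact COLUMN (read_dt);

    /* Aggregate join index covering a frequent reporting access path */
    CREATE JOIN INDEX dw_db.usage_fact_ji AS
      SELECT meter_id, read_dt, SUM(kwh_used) AS tot_kwh
      FROM   dw_db.usage_fact
      GROUP BY meter_id, read_dt
    PRIMARY INDEX ( meter_id );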

Environment: NCR 4800/5100, Teradata RDBMS, Teradata V13 (BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant)

Confidential, Milwaukee, WI Jan 2011 – Feb 2012

Sr Teradata Developer/Analyst

This project deals with the development of a new beer ordering tool called DRIVE for a set of pilot distributors who are part of the MillerCoors supply chain. The legacy systems of both Miller and Coors were integrated into one supply chain process. The data in the legacy systems was transformed into the current existing set of data for all the pilot distributors. In the next release, this tool would be rolled out to a couple of hundred distributors.

Responsibilities

  • Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.
  • Developed mappings to load data from source systems such as Oracle and AS/400 into the Data Warehouse.
  • Developed scripts for loading the data into the base tables in the EDW using the FastLoad, MultiLoad and BTEQ utilities of Teradata (see the sketch after this list).
  • Handled both incremental data and migration data loads into Teradata.
  • Involved in designing the ETL process to extract, transform and load data from the OLTP Oracle database system to the Teradata data warehouse.
  • Worked efficiently on Teradata Parallel Transporter (TPT) and generated load code.
  • Generated custom JCL scripts for processing mainframe flat files and IBM DB2 extracts.
  • Developed performance utilization charts, optimized and tuned SQL, and designed physical databases using Mainframe MVS COBOL, Teradata load utilities and SQL.
  • Extracted source data from mainframe OLTP systems by writing COBOL and JCL scripts.
  • Responsible for trouble shooting, identifying and resolving data problems.
  • Created the proper PI taking into consideration both planned access and even distribution of data across all the available AMPs.
  • Loaded and transferred large volumes of data from different databases into Teradata using MLoad and OLE Load.
  • Created series of Teradata Macros for various applications in Teradata SQL Assistant.
  • Involved in writing complex SQL queries based on the given requirements and for various business tickets to be handled.
  • Created Teradata models in Erwin.
  • Performance tuning for Teradata SQL statements using Teradata Explain command.
  • Created several SQL queries and created several reports using the above data mart for UAT and user reports.
  • Organized the data efficiently in the Teradata system using Teradata Manufacturing Logical Data Model.
  • Worked efficiently on Teradata Parallel Transporter code.
  • Used SQL features such as GROUP BY ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE and NULL handling.
  • Generated configuration files, DML files and XFR files that specify the record format, which are used in components for building graphs in Ab Initio.
  • Involved in creating flat files using dataset components such as Input File, Output File and Intermediate File in Ab Initio graphs.
  • Worked on enhancements of BTEQ scripts and Ab Initio graphs that validated the performance tables in the Teradata environment.
  • Developed graphs using multistage components.
  • Extensively Used Transform Components: Reformat, Rollup and Scan Components.
  • Implemented the component level, pipeline and Data parallelism in Ab Initio for ETL process for Data warehouse.
  • Extensively used partitioning components such as Broadcast, Partition by Key, Partition by Range and Partition by Round Robin, and departition components such as Concatenate, Gather and Merge in Ab Initio.
  • Responsible for the automation of Ab Initio graphs using Korn shell scripts.
  • Developed Ab Initio scripts for data conditioning, transformation, validation and loading.
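
A minimal FastLoad sketch of the kind of base-table load script mentioned above. The logon string, file path, staging table and column names are placeholder assumptions, not actual project artifacts.

    /* FastLoad sketch -- loads a pipe-delimited flat file into an empty staging table */
    SESSIONS 8;
    ERRLIMIT 25;
    LOGON tdprod/etl_user,password;

    SET RECORD VARTEXT "|";
    DEFINE  order_id  (VARCHAR(20)),
            dist_id   (VARCHAR(20)),
            ship_qty  (VARCHAR(20))
    FILE = /data/inbound/shipments.dat;

    /* error tables and checkpoint for restartability */
    BEGIN LOADING stg_db.shipment_stg
          ERRORFILES stg_db.shipment_et, stg_db.shipment_uv
          CHECKPOINT 100000;

    INSERT INTO stg_db.shipment_stg
    VALUES ( :order_id, :dist_id, :ship_qty );

    END LOADING;
    LOGOFF;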

Environment: NCR 4800/5100, Teradata RDBMS, Teradata V12 (BTEQ, FastLoad, MultiLoad, FastExport, Teradata SQL Assistant), Microsoft SQL Server 2008, UNIX shell scripting

Confidential, San Francisco, CA Sep 2009 – Jan 2011

Senior Teradata Developer/Technical Analyst

With vast experience as a Teradata developer, it was a unique experience to work for one of the premier financial companies, Wells Fargo. The customer database is used in driving, analyzing and executing marketing programs to improve customer retention and increase revenue. The data warehouse mandate includes logical and physical database design and implementation, business intelligence application development and rollout, and enterprise-wide information and reporting applications. Focused on developing tools and database features that facilitate query-plan analysis and performance tuning.

Responsibilities

  • Worked on loading of data from several flat files sources using Teradata FastLoad and MultiLoad.
  • Wrote Teradata SQL queries to join tables and make modifications to tables.
  • Transfer of large volumes of data using Teradata FastLoad, MultiLoad and T-Pump.
  • Database-to-database transfer of data (minimal transformations) using ETL (Ab Initio).
  • Fine-tuned the existing mappings and achieved increased performance and reduced load times for faster user query performance.
  • Used Trillium for data quality checking, data cleansing, data standardization and matching (Converter, Parser, Geocoder, Matcher).
  • Name parsing and standardization performed through Trillium-supplied tables built at Harte-Hanks' data processing center.
  • Address parsing and correction through Trillium postal tables and geocoding modules
  • Developed graphs for the ETL processes using Join, Rollup, Scan, Normalize, Denormalize and Reformat transform components as well as Partition and Departition components extensively.
  • Implemented Lookups, lookup_local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Quickly adapted to Capital One Agile methodology (3 week sprints) and actively participated in Sprint Planning sessions.
  • Creation of customized Mload scripts on UNIX platform for Teradata loads using Ab Initio.
  • Sorted data files using UNIX Shell scripting.
  • Fine-tuned MLoad scripts considering the number of scheduled loads and the volumes of load data.
  • Used data profiler in ETL processes, Data integrators to ensure the requirements of the clients including checks on column property, column value, and referential integrity.
  • Acted as a single resource with sole responsibility of Ab Initio – Teradata conversions.
  • Written scripts to extract the data from Oracle and load into Teradata.
  • Worked on datasets for analysis.
  • Worked with SAS on exporting data using Teradata FastExport.
  • Written Teradata BTEQ scripts to implement the business logic.
  • Hands-on with Teradata Queryman to interface with the Teradata database.
  • Used SQL Profiler for troubleshooting, monitoring and optimizing SQL Server for developers and testers.
  • Used UNIX scripts to access Teradata & Oracle data.
  • Developed UNIX shell scripts for data manipulation
  • Involved in writing proactive data audit scripts.
  • Involved in writing data quality scripts for new market integration
  • Developed complex transformation code for derived duration fields.
  • Developed BTEQ scripts to extract data from the detail tables for reporting requirements (see the sketch below).
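
A small BTEQ sketch illustrating the reporting extracts mentioned above; the logon string, table, columns and output path (dw_db.txn_detail, /reports/weekly_balance.txt) are hypothetical placeholders.

    .LOGON tdprod/rpt_user,password;
    .SET WIDTH 254;
    .SET SEPARATOR '|';

    .EXPORT REPORT FILE = /reports/weekly_balance.txt;

    /* weekly summary pulled from a hypothetical detail table */
    SELECT   acct_nbr
            ,COALESCE(SUM(txn_amt), 0)  AS wk_total
    FROM     dw_db.txn_detail
    WHERE    txn_dt BETWEEN DATE '2010-01-04' AND DATE '2010-01-10'
    GROUP BY acct_nbr
    ORDER BY wk_total DESC;

    /* abort the batch job with a non-zero return code on any SQL error */
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .EXPORT RESET;
    .LOGOFF;
    .QUIT 0;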

Environment: NCR 4800/5100, Teradata V12 (BTEQ, FastLoad, MultiLoad, Teradata SQL, FastExport), Mainframes MVS/OS, JCL, TSO/ISPF, COBOL, DB2, SAS, Oracle, Ab Initio (GDE 1.15, Co>OS: V1.15), EME, Trillium

Confidential, Atlanta, GA Feb 2008 – Aug 2009

Teradata Developer

This project deals with transactions at the regional and country level. Two systems are followed: Advantage and E-Advantage. The Advantage application captures data from flat files and Excel sheets, whereas the E-Advantage application pumps huge volumes of data into the production database, which makes it impossible to maintain beyond 90 days. Each location at the end of the day has to transfer its data to central zones, and each location daily generates reports for different purposes such as risk portfolio and index fund portfolio. They needed extensive data warehousing to maintain historical data at a central location for integration and to analyze the business information from different locations according to the profit areas, serving the purpose of DSS management.

Responsibilities

  • Involved in database design/preparing SQL scripts to support the larger databases that involves terabytes of data.
  • Coordinated with the database team to create the necessary data sources for PSG (Premier Services) and FA (Financial Accounts) using Metadata Utility.
  • Involved in the design of complex campaigns for the Business users to accomplish different marketing strategies.
  • Coordinated with the test team in the design of test cases and preparation of test data to work with different channels, and set up recency and timeout for the campaigns.
  • Involved in running the batch process for Teradata CRM.
  • Created BTEQ, FastExport, MultiLoad, TPump and FastLoad scripts (see the sketch after this list).
  • Worked on complex queries to map the data as per the requirements.
  • Extracted data from various production databases SAS, SYBASE, and Teradata to meet business report needs.
  • Developed Ab Initio XFRs to derive new fields and solve various business requirements.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, Merge, etc.
  • Worked on improving the performance of Ab Initio graphs by using various Ab Initio performance techniques, such as using lookups instead of joins.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL.
  • Worked on some critical problems like booked metrics and solved them successfully using SAS/SQL
  • Interacted with technical and business analyst, operation analyst to resolve data issues
  • Analyzed the current data movement (Informatica ETL) processes and procedures.
  • Used Data profiler to allow the analysis of data directly in the database, which improves performance, while eliminating the time and costs of moving data among databases.
  • Identify and assess external data sources as well as internal and external data interfaces
  • Created and updated MS Excel mapping document by doing field level analysis and field mapping.
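
A hedged FastExport sketch of the kind of extract script listed above; all names (dw_db.portfolio_snapshot, the logon string and the output path) are illustrative assumptions only.

    .LOGTABLE dw_db.portfolio_fexp_log;
    .LOGON tdprod/etl_user,password;

    .BEGIN EXPORT SESSIONS 8;

    .EXPORT OUTFILE /data/outbound/portfolio_extract.dat
            MODE RECORD FORMAT TEXT;

    /* pipe-delimited extract from a hypothetical snapshot table */
    SELECT  TRIM(acct_nbr)                    || '|' ||
            TRIM(portfolio_cd)                || '|' ||
            CAST(balance_amt AS VARCHAR(20))
    FROM    dw_db.portfolio_snapshot
    WHERE   snapshot_dt = DATE '2008-06-30';

    .END EXPORT;
    .LOGOFF;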

Environment: Teradata V2R6, BTEQ, Teradata Manager, Teradata SQL Assistant, FastLoad, MultiLoad, FastExport, Rational ClearQuest, Control-M, UNIX, MQ, NDM, FTP, SAS, Ab Initio (GDE 1.14, Co>OS: V1.14), EME

Confidential, New Berlin, WI Mar 2007 – Feb 2008

Teradata developer

Confidential has different subsidiaries such as Kinko's Express, Ground and Freight. Currently, revenue structure and customer structure are managed differently for these subsidiaries. The project was developed to design and construct an Enterprise Data Warehouse for the organization, so as to have a single point of reference for customer data from the various databases.

Responsibilities

  • Migrated tables from Oracle to Teradata.
  • Wrote BTEQ and MLoad scripts to load data from Oracle to Teradata (see the sketch after this list).
  • Helped V-MIS in preparing usage inventory document.
  • Analyzed the dependencies of existing job on Oracle data mart (BCMS).
  • Used UNIX/Perl scripts to access Teradata & Oracle data.
  • Sourced data from Teradata to Oracle using Fast Export and Oracle SQL Loader.
  • Worked on Informatica PowerCenter tools – Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager and Informatica Server Manager.
  • The Informatica metadata repository was created using the Repository Manager as a hub for interaction between the various tools; security, user management and repository backups were also done using the same tool.
  • Informatica Designer tools were used to design the source definition, target definition and transformations to build mappings.
  • Created the mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Server Manager was used for creating and maintaining sessions; it was also used to monitor, edit, schedule, copy, abort and delete sessions.
  • Used ETL for efficient mapping and transformation techniques.
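
A minimal MultiLoad sketch of an Oracle-extract-to-Teradata load like the one described above. The staging table, layout fields and file path are hypothetical placeholders, and the upsert DML shown is just one possible apply strategy.

    .LOGTABLE stg_db.cust_ml_log;
    .LOGON tdprod/etl_user,password;

    .BEGIN IMPORT MLOAD TABLES stg_db.customer_stg
           WORKTABLES  stg_db.customer_wt
           ERRORTABLES stg_db.customer_et stg_db.customer_uv;

    .LAYOUT cust_layout;
    .FIELD  cust_id    * VARCHAR(20);
    .FIELD  cust_name  * VARCHAR(60);
    .FIELD  subsidiary * VARCHAR(10);

    /* update existing customers, insert the ones that are missing */
    .DML LABEL upsert_cust
         DO INSERT FOR MISSING UPDATE ROWS;
      UPDATE stg_db.customer_stg
         SET cust_name  = :cust_name,
             subsidiary = :subsidiary
       WHERE cust_id = :cust_id;
      INSERT INTO stg_db.customer_stg (cust_id, cust_name, subsidiary)
      VALUES (:cust_id, :cust_name, :subsidiary);

    .IMPORT INFILE /data/oracle_extract/customers.dat
            FORMAT VARTEXT '|'
            LAYOUT cust_layout
            APPLY upsert_cust;

    .END MLOAD;
    .LOGOFF;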

Environment: Red Hat Enterprise Linux 64-bit, Windows XP, Informatica PowerCenter 7.1, NCR Teradata V2R6, Business Objects 11, Crystal Reports 10.

Confidential, Denver, CO Oct 2005 – Feb 2007
Teradata developer

The primary objective of this project was to enhance the existing Siebel ETL environment adding new ETL code and consolidating the underlying ETL migration architecture.

Responsibilities:

  • Involved in the analysis of the Issues and proposing the solutions to the client
  • Involved in the analysis of test results and documenting the test results.
  • Preparing detailed documents to be shared across the organization.
  • Create export scripts using Teradata Fast Export Utility.
  • Preparing Test cases and Test Data
  • Create data manipulation and definition scripts using the Teradata BTEQ utility.
  • Involved in the analysis and design of the system.
  • Involved in Testing of the prototype.
  • Create load scripts using the Teradata FastLoad and MLoad utilities.
  • Perform code reviews and code walkthroughs of the prototype.
  • Create procedures in Teradata SQL.
  • Attending corporate meeting for the kick off of major enhancements at corporate level.
  • Organizing meeting with the SME’s of the dependent systems when changes are done for the existing system.
  • Create UNIX scripts for file manipulation (see the sketch after this list).
  • Involved in preparation of architecture and infrastructure documentation.
  • Involved in Designing DR process for the ESB application
  • Involved in the Weekly issue meeting with the customer.
  • Organizing meeting with the department heads when changes are recommended for the existing system for performance improvement.
  • Organizing meeting with the team on daily basis.
  • Reverse engineer and document existing code.
  • Involved in decision making for changing the existing programs for special processing.
  • Involved in creating and visualizing dashboards using Tableau Desktop.
  • Create maps, bar and line charts, and heat maps extensively using Tableau.
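
A short Korn shell sketch of the kind of UNIX wrapper referenced above, running a BTEQ script and propagating its return code to the scheduler; the directory paths and script name are assumed placeholders, not actual project artifacts.

    #!/bin/ksh
    # Illustrative wrapper only -- runs a BTEQ data-manipulation script and
    # fails the batch job when BTEQ returns a non-zero code.

    SCRIPT_DIR=/apps/etl/scripts
    LOG_DIR=/apps/etl/logs
    RUN_DT=$(date +%Y%m%d)

    bteq < ${SCRIPT_DIR}/load_siebel_stage.btq > ${LOG_DIR}/load_siebel_stage_${RUN_DT}.log 2>&1
    RC=$?

    if [ ${RC} -ne 0 ]; then
        echo "BTEQ step failed with return code ${RC}" >> ${LOG_DIR}/load_siebel_stage_${RUN_DT}.log
        exit ${RC}
    fi

    exit 0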

Environment: UNIX, Teradata RDBMS, BTEQ, FastLoad, Teradata Administrator, MultiLoad, TPump, Power, FTP, SFTP
