
Senior Data Analyst Resume


Reston, VA

SUMMARY

  • Over 8 years of experience in Business/Data Analysis, ETL processes, Reporting Services, database design and data modeling.
  • Experience working with all phases of the Software Development Life Cycle (SDLC) in Waterfall and Agile methodologies.
  • Experience gathering user requirements, generating the Business Requirement Document (BRD) and Functional Requirement Document (FRD), and maintaining the Data Dictionary and Metadata.
  • Worked with SSIS tools such as the Import and Export Wizard, Package Installation, and the SSIS Package Designer.
  • Extensive experience in functional testing, integration testing, regression testing, black box testing, GUI testing, back-end testing and browser compatibility testing.
  • Experience working with Teradata utilities such as BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems, including flat files.
  • Created databases, users, tables, triggers, views, stored procedures, functions, packages, joins and hash indexes in the Teradata database.
  • Good at Designing complex Informatica mappings and re-usable transformations to facilitate incremental and initial data loads.
  • Skilled in working with software testing tools such as HP ALM and Rally for test cases, defect tracking and requirement mapping.
  • Skilled at writing User Stories and Functional Specifications Documents.
  • Excellent at creating Use Cases and UML diagrams such as Use Case, Activity, Sequence, E-R and Data Flow diagrams.
  • Expert with SQL and querying Databases.
  • Experience with Agile methodology, daily Scrums and 2-week sprints.
  • Very good at systems analysis, diagnostics, troubleshooting and conflict resolution.
  • Excellent knowledge of Relational Database Management Systems (RDBMS).
  • Transferred data between servers using Data Transformation Services (DTS) and SQL Server Integration Services (SSIS).
  • Worked on Data Blending to develop Tableau workbooks from multiple data sources.
  • Worked with Bulk Copy Program (BCP) extensively for flat file import and export process to/from SQL Server.
  • Developed Tableau Visualization and Dashboard using Tableau Desktop.
  • Maintained existing issues reporting processes (ALM) and enforced an issue management workflow.
  • Strong in SQL and PL/SQL - stored procedures, views, cursors and triggers.
  • Strong in writing T-SQL, working on DTS, SSIS and SSRS.
  • Validated Tableau results by querying the underlying database tables and comparing the numbers against the reports.
  • Migrated SQL Server 2008 tables into CSV files using SSIS packages.
  • Experience in report writing using SQL Server Reporting Services (SSRS) and in creating several types of reports like table, matrix and chart report.
  • Proficient in writing queries and stored procedures in SQL Server 2008/2012.
  • Provided support to developers during analysis, design, coding and implementation of new applications.
  • Identified gaps and risks in the process and automated reports using VBA, resulting in 100% accuracy, time savings and risk reduction.
  • Good at writing UNIX/Linux shell scripts, with advanced knowledge of Excel (LOOKUPs, pivot tables, charts, IF formulas, macros).
  • Good communication skills; hardworking and self-motivated.
  • Excellent analytical and problem-solving skills.
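
The flat-file import/export work described above (BCP to and from SQL Server) can be sketched in plain Python, with the standard-library sqlite3 module standing in for SQL Server and made-up table names; this is an illustration of the pattern, not the actual BCP utility.

```python
import csv
import os
import sqlite3
import tempfile

def export_table_to_csv(conn, table, path):
    """Export all rows of a table to a flat CSV file (analogous to 'bcp out')."""
    cur = conn.execute(f"SELECT * FROM {table}")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cur.description])  # header row
        writer.writerows(cur.fetchall())

def import_csv_to_table(conn, table, path):
    """Bulk-load a flat CSV file into a table (analogous to 'bcp in')."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        placeholders = ",".join("?" * len(header))
        conn.executemany(
            f"INSERT INTO {table} ({','.join(header)}) VALUES ({placeholders})",
            reader,
        )
    conn.commit()

# Hypothetical source and destination tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE dst (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b")])

path = os.path.join(tempfile.mkdtemp(), "src.csv")
export_table_to_csv(conn, "src", path)
import_csv_to_table(conn, "dst", path)
rows = conn.execute("SELECT id, name FROM dst ORDER BY id").fetchall()
print(rows)
```

The real BCP calls would look like `bcp db.dbo.src out src.csv -c -t,` and `bcp db.dbo.dst in src.csv -c -t,`; the round trip above mirrors that export-then-import flow.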

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 9.x/8.x/7.x, Informatica Power Exchange 8.6.1, DataStage 8.x, Informatica Data Quality, Informatica MDM

Databases: SQL Server 2012/2010/2008, Oracle 11g/10g, Teradata R13/R12/V2R6

Languages: SQL, T-SQL, PL/SQL, Unix Shell Scripts, CSS, HTML, XML, MHTML, Java, C#, ASP.NET, VB.NET, C++, C

Testing Concepts: SDLC, STLC, Testing Levels, Testing Types

Other Tools: SQL Query Analyzer, SQL Enterprise Manager, SQL Server Management Studio, SQL Server Query Editor, Solution Explorer, Analysis Manager, DTS Services, UML, Informatica, Database Design and Normalization, Data Modeling, HP Quality Center, ALM, MS Word, MS Excel, MS PowerPoint

Operating Systems: Windows, Unix

Methodologies: Agile and Waterfall SDLC methodologies

PROFESSIONAL EXPERIENCE

Confidential, Reston, VA

Senior Data Analyst

Responsibilities:

  • Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Developed the Functional Requirement Document (FRD) and data architecture document and communicated them to the concerned stakeholders. Conducted impact and feasibility analysis.
  • Extensively used Teradata utilities such as FastLoad, MultiLoad, BTEQ and FastExport.
  • Created Teradata external loader connections such as MLoad, Upsert and Update, and FastLoad while loading data into the target tables in the Teradata database.
  • Created a Salesforce external loader application and used it to connect to salesforce.com.
  • Created mapplets and reusable transformations and used them in different mappings.
  • Created workflows and used various tasks such as Email, Event-Wait and Event-Raise, Timer, Scheduler, Control, Decision and Session in the Workflow Manager.
  • Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-Through partitions.
  • Validated Informatica mappings for source compatibility after version changes at the source.
  • Troubleshot long-running sessions and fixed the issues.
  • Implemented a daily and weekly audit process for the Claims subject area to ensure the data warehouse matches the source systems for critical reporting metrics.
  • Developed shell scripts for daily and weekly loads and scheduled them using the Unix Maestro utility.
  • Involved in writing and debugging SQL scripts, stored procedures and functions.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Involved in unit testing and system testing to check that data extracted from different source systems was loaded accurately into the targets according to the user requirements.
  • Worked with the reporting team to help them understand the user requirements for the reports and the measures on them. Helped them create canned reports.
  • Migrated repository objects, services and scripts from the development environment to the production environment. Extensive experience in troubleshooting and solving migration and production issues.
  • Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by the user community.
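
The daily/weekly audit process for the Claims subject area boils down to reconciling the warehouse against the source system on critical metrics. The sketch below shows a row-count reconciliation per load date; table and column names are hypothetical, and sqlite3 stands in for Teradata.

```python
import sqlite3

def audit_counts(conn, source_table, target_table, key="load_date"):
    """Compare per-day row counts between a source table and its warehouse
    counterpart, returning (date, source_count, target_count) for mismatches."""
    sql = f"""
        SELECT s.{key}, s.cnt AS src_cnt, COALESCE(t.cnt, 0) AS tgt_cnt
        FROM (SELECT {key}, COUNT(*) AS cnt FROM {source_table} GROUP BY {key}) s
        LEFT JOIN (SELECT {key}, COUNT(*) AS cnt FROM {target_table} GROUP BY {key}) t
          ON s.{key} = t.{key}
        WHERE COALESCE(t.cnt, 0) <> s.cnt
    """
    return conn.execute(sql).fetchall()

# Hypothetical source and warehouse claim tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_claims (claim_id INTEGER, load_date TEXT)")
conn.execute("CREATE TABLE dw_claims (claim_id INTEGER, load_date TEXT)")
conn.executemany("INSERT INTO src_claims VALUES (?, ?)",
                 [(1, "2015-01-01"), (2, "2015-01-01"), (3, "2015-01-02")])
conn.executemany("INSERT INTO dw_claims VALUES (?, ?)",
                 [(1, "2015-01-01"), (3, "2015-01-02")])  # one row missing

mismatches = audit_counts(conn, "src_claims", "dw_claims")
print(mismatches)  # the 2015-01-01 load is short one row
```

In practice the same check would run as a scheduled BTEQ or shell job, extended beyond row counts to sums of the critical reporting metrics.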

Environment: Informatica Power Center 8.6.1/9.0.1, Power Exchange 8.1, Teradata, Oracle 11g, Salesforce, XML, XSD, IBM MQ Series, Unix, Unix Shell Scripting, Erwin, MicroStrategy 7i.

Confidential, Lutherville, MD

Data Analyst

Responsibilities:

  • Involved in all phases of the SDLC, from requirements, design, development and testing through production and support of the production environment.
  • Handled a 2-terabyte data warehouse database.
  • Involved in end-to-end development, from OLTP systems to the creation of data marts.
  • Worked extensively with transformations such as Aggregator (monthly, quarterly and yearly claim aggregations), connected and unconnected Lookups (to fetch claim IDs from other tables, using SQL override to filter the data), Router (to route fraud cases and settled claims) and Sequence Generator.
  • Involved in performance tuning of large tables, such as the FIRST RESERVATIONS table (around 1 TB of data), by creating partitions and using stored procedures.
  • Also tuned mappings by tracking the reader, writer and transformation threads in the session logs; set the tracing level to verbose only during development and only with very small data sets.
  • Worked on flat files as sources, targets and lookups.
  • Used PMCMD to start and abort sessions and tasks. Also used INFACMD to track the status of the applications and automate the service notification emails.
  • Implemented various optimization techniques in the Aggregator, Lookup and Joiner transformations.
  • Developed mappings/sessions using Informatica Power Center 8.6/7.1 for data loading. Developed mappings to load data into slowly changing dimensions (SCD).
  • Used Informatica parameter files to filter the daily data from the source system.
  • Involved in data quality analysis to determine the cleansing requirements.
  • Used various debugging techniques and the Informatica debugger tool to debug the mappings.
  • Created test cases for unit testing, system integration testing and UAT to check the data quality.
  • Created Oracle stored procedures to implement complex business logic for better performance.
  • Created materialized views for summary data to improve query performance.
  • Responsible for loading data into the warehouse from various sources, using Oracle Loader to load millions of records and the Import/Export utility to load small tables.
  • Responsible for scheduling the workflows based on the nightly load.
  • Supported Oracle 10g databases running mission critical 24*7 systems.
  • Used PMCMD extensively to start, stop, schedule, and monitor Informatica workflows.
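
Loading a slowly changing dimension, as mentioned above, typically means SCD Type 2: when a tracked attribute changes, the current dimension row is expired and a new versioned row is appended. The sketch below shows that logic with in-memory rows and made-up column names rather than an Informatica mapping.

```python
def apply_scd2(dimension, key, new_attrs, load_date):
    """Apply an SCD Type 2 change: expire the current row for `key`
    (if its attributes changed) and append a new current row."""
    current = next((r for r in dimension
                    if r["key"] == key and r["end_date"] is None), None)
    if current is not None:
        if current["attrs"] == new_attrs:
            return dimension  # no change; keep history as-is
        current["end_date"] = load_date  # close out the old version
    dimension.append({"key": key, "attrs": new_attrs,
                      "start_date": load_date, "end_date": None})
    return dimension

dim = []
apply_scd2(dim, 100, {"state": "MD"}, "2012-01-01")
apply_scd2(dim, 100, {"state": "VA"}, "2013-06-15")  # member moved

# Two versions now exist: the old row closed, the new one open-ended.
print(dim)
```

In the actual mapping this branch (insert new vs. expire-and-insert) is what the Lookup plus Router/Update Strategy transformations implement against the dimension table.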

Environment: Informatica Power Center 8.6/7.1.1, Oracle 9i & 10g, PL/SQL, MicroStrategy 7i, Mainframe DB2, MS Visio, ERWIN, Data Modeling tool, TOAD, Windows 2000, UNIX AIX 5.1

Confidential, Dayton, OH

Data Analyst

Responsibilities:

  • Participated in requirements discussions with business units, ensuring all business requirements were satisfactorily met.
  • Designed and implemented parameterized and cascading parameterized reports using SSRS.
  • Worked extensively with the QA team for designing Test Plan and Test Cases for the User Acceptance Testing.
  • Created and maintained documentation on processes, application configuration, and training material for users.
  • Used ETL tools such as DTS for data flows from source files such as tables, views and Excel to other databases with proper mapping.
  • Created and maintained databases, tables, views, users, logins, indexes, check constraint, and business rules using T-SQL.
  • Conducted Data analysis including acquisition, cleansing, transformation, modeling, visualization, documentation and presentation of results.
  • Worked with Teradata SQL queries using Teradata SQL Assistant for ad hoc data pull requests.
  • Worked with pivot tables in Excel by getting data from Teradata and Oracle.
  • Developed reports using SSRS on statistical data during migration of data.
  • Designed and implemented SQL queries for data analysis and data validation and compare data in test and production environment.
  • Tasked with baselining requirements in DOORS and with configuration and change management using DOORS.
  • Managed all requirements in DOORS and Rational Requisite Pro, making requirements available to all the teams.
  • Created drill down, drill through, linked and sub reports using SQL Server Reporting Services (SSRS).
  • Involved in development, implementation, administration and support of ETL processes for the Data Warehouse using DTS and SSIS.
  • Worked with SSIS Import/Export Wizard for performing the ETL operations.
  • Helped data modelers prepare models and reviewed them to check that the requirements were met.
  • Assisted Data Analyst in mapping task and analyzed existing reports.
  • Led preparation of the data warehouse requirements document.
  • Designed packages with SSIS for workflow solutions and managed the data flow.
  • Exported and imported data from text files and Excel to SQL Server database using BULK insert and BCP utility.
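
Comparing data between the test and production environments, as described above, often reduces to a set-difference query that surfaces rows present in one environment but missing or different in the other. A minimal sketch, with hypothetical tables and sqlite3 standing in for SQL Server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prod_orders (order_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE test_orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO prod_orders VALUES (?, ?)",
                 [(1, 10.0), (2, 20.0), (3, 30.0)])
conn.executemany("INSERT INTO test_orders VALUES (?, ?)",
                 [(1, 10.0), (2, 25.0)])  # one row changed, one missing

# Rows present in production but absent (or different) in test.
diff = conn.execute("""
    SELECT order_id, amount FROM prod_orders
    EXCEPT
    SELECT order_id, amount FROM test_orders
    ORDER BY order_id
""").fetchall()
print(diff)
```

Running the same query with the tables swapped catches rows that exist only in test; together the two directions give a full two-way reconciliation.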

Environment: MS Project, MS Excel, MS Word, MS Visio, MS Access, Visual Basic, SQL, SQL Server, Oracle, Quality Center, Informatica Power Center 6.0

Confidential, Piscataway, NJ

ETL Developer

Responsibilities:

  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Created shell scripts to fine tune the ETL flow of the Informatica workflows.
  • Used Informatica file watch events to poll the FTP sites for the external mainframe files.
  • Provided production support to resolve ongoing issues and troubleshoot problems.
  • Performance tuning was done at the functional level and map level. Used relational SQL wherever possible to minimize the data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Effectively worked in an onsite/offshore work model.
  • Pre- and post-session assignment variables were used to pass variable values from one session to another.
  • Designed workflows with many sessions using Decision, Assignment, Event-Wait and Event-Raise tasks; used the Informatica scheduler to schedule jobs.
  • Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Identified problems in existing production data and developed one time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
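
The parameter files used above for mapping variables and connections are INI-style text files with `[folder.WF:workflow]` section headers. The minimal parser below assumes that common layout; the section and variable names in the sample are made up for illustration.

```python
def parse_param_file(text):
    """Parse an Informatica-style parameter file into
    {section: {name: value}} (a simplified sketch)."""
    params, section = {}, None
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue  # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params

sample = """\
[EDW.WF:wf_daily_claims]
$$LoadDate=2015-01-01
$DBConnection_Source=TD_PROD
"""
params = parse_param_file(sample)
print(params)
```

Keeping `$$` mapping variables and `$DBConnection` values in such a file is what lets the same workflow run unchanged across environments, with only the parameter file swapped at deployment time.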

Environment: Informatica Power Center 7.1, Informatica PowerMart, MS Access Reports, Unix Shell Scripting, SQL*Plus, Erwin, SQL*Loader, MS SQL Server 2008, Sun Solaris 2.7, DB2.

Confidential

ETL Developer

Responsibilities:

  • Resolved issues related to the Enterprise Data Warehouse (EDW) and stored procedures in the OLTP system; analyzed, designed and developed ETL strategies.
  • Developed mappings, sessions and workflows in Informatica Power Center.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Worked with heterogeneous sources to extract data from Oracle databases, XML and flat files and load it into a relational Oracle warehouse.
  • Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
  • Migrated day one data for existing members in PCMH program from EDW into OLTP system using ETL process.
  • Performed tuning of SQL queries and stored procedures for speedy extraction of data to resolve and troubleshoot issues in the OLTP environment.
  • Troubleshot long-running sessions and fixed the issues related to them.
  • Worked with Variables and Parameters in the mappings to pass the values between sessions.
  • Involved in the development of PL/SQL stored procedures, functions and packages to process business data in OLTP system.
  • Carried out changes into Architecture and Design of Oracle Schemas for both OLAP and OLTP systems.
  • Worked with the Services and Portal teams on various occasions on data issues in the OLTP system.
  • Worked with the testing team to resolve bugs related to day one ETL mappings before production.
  • Created weekly project status reports, tracked the progress of tasks against the schedule, and reported risks and contingency plans to management and business users.
  • Involved in meetings with production team for issues related to Deployment, maintenance, future enhancements, backup and crisis management of DW.
  • Worked with production team to resolve data issues in Production database of OLAP and OLTP systems.
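
Query tuning of the kind described above usually starts by checking whether a frequent lookup actually uses an index. The sketch below uses sqlite3's `EXPLAIN QUERY PLAN` (a hypothetical members table; Oracle's equivalent would be `EXPLAIN PLAN`) to show the plan switching from a full scan to an index search once the index exists.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE members (member_id INTEGER, program TEXT)")
conn.executemany("INSERT INTO members VALUES (?, ?)",
                 [(i, "PCMH" if i % 2 else "OTHER") for i in range(1000)])

query = "SELECT * FROM members WHERE member_id = 42"

# Without an index, the plan is a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

conn.execute("CREATE INDEX idx_members_id ON members (member_id)")

# With the index in place, the planner switches to an index search.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

plan_before = " ".join(row[-1] for row in before)
plan_after = " ".join(row[-1] for row in after)
print(plan_before)  # mentions SCAN
print(plan_after)   # mentions USING INDEX
```

The same before/after comparison of execution plans is the first step when tuning the OLTP stored procedures mentioned above, before resorting to rewrites or hints.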

Environment: Informatica Power Center 6.1, Oracle 11g/10g/9i/8i, PL/SQL, SQL Developer 3.0.1, Toad 11
