
Senior Data Analyst Resume


Reston, VA

PROFESSIONAL SUMMARY:

  • 8+ years of experience in Business/Data Analysis, ETL processes, Reporting Services, Database Design, and Data Modeling.
  • Experience working with distinct phases of the Software Development Life Cycle (SDLC) in Waterfall and Agile methodologies.
  • Experience in gathering user requirements, generating Business Requirement Documents (BRD) and Functional Requirement Documents (FRD), and maintaining Data Dictionaries and Metadata.
  • Worked with SSIS tools such as the Import and Export Wizard, Package Installation, and SSIS Package Designer.
  • Extensive experience in functional testing, integration testing, regression testing, black-box testing, GUI testing, back-end testing, and browser compatibility testing.
  • Experience working with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files (a minimal BTEQ sketch follows this list).
  • Created databases, users, tables, triggers, views, stored procedures, functions, packages, joins, and hash indexes in Teradata.
  • Good at designing complex Informatica mappings and reusable transformations to facilitate incremental and initial data loads.
  • Skilled in working with software testing tools such as HP ALM and Rally for test cases, defect tracking, and requirements mapping.
  • Skilled at writing User Stories and Functional Specifications Documents.
  • Excellent at creating Use Cases and UML diagrams such as Use Case, Activity, Sequence, E-R, and Data Flow diagrams.
  • Expert in SQL and querying databases.
  • Experience with Agile methodology, daily Scrums, and 2-week sprints.
  • Very good at systems analysis, diagnostics, troubleshooting, and conflict resolution.
  • Excellent knowledge of Relational Database Management Systems (RDBMS).
  • Transferred data between servers using Data Transformation Services (DTS) and SQL Server Integration Services (SSIS).
  • Worked on Data Blending to develop Tableau workbooks from multiple data sources.
  • Worked extensively with the Bulk Copy Program (BCP) for flat-file import and export to/from SQL Server.
  • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Maintained existing issues reporting processes (ALM) and enforced an issue management workflow.
  • Strong in SQL and PL/SQL - stored procedures, views, cursors and triggers.
  • Strong in writing T-SQL, working on DTS, SSIS and SSRS.
  • Validated Tableau results by checking the numbers against the data in the database tables through direct database queries.
  • Migrated SQL Server 2008 tables into CSV files using SSIS packages.
  • Experience in report writing using SQL Server Reporting Services (SSRS) and in creating several types of reports such as table, matrix, and chart reports.
  • Proficient in writing queries and stored procedures in SQL Server 2008/2012.
  • Provided support to developers during analysis, design, coding and implementation of new applications.
  • Identified gaps and risks in the process and automated reports using VBA, resulting in 100% accuracy, time savings, and risk reduction.
  • Good at writing UNIX/Linux shell scripts, with advanced knowledge of Excel (LOOKUPs, pivot tables, charts, IF formulas, macros).
  • Good communication skills; hardworking and self-motivated.
  • Excellent analytical and problem-solving skills.
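
For illustration only, a minimal sketch of the kind of BTEQ flat-file export referenced above. This is a hedged sketch, not a script from any engagement: the host, credentials, database, table, and file names (dw_host, etl_user, claims_db.daily_claims, /data/out/daily_claims.txt) are hypothetical placeholders.

#!/bin/sh
# Hedged sketch: export rows from a Teradata table to a flat file with BTEQ.
# All connection details and object names below are hypothetical.
bteq <<EOF
.LOGON dw_host/etl_user,etl_password;
.EXPORT REPORT FILE=/data/out/daily_claims.txt;
SELECT claim_id, claim_amt, load_dt
FROM   claims_db.daily_claims
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF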

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 9.x/8.x/7.x, Informatica Power Exchange 8.6.1, DataStage 8.x, Informatica Data Quality, Informatica MDM

Databases: SQL Server 2012/2010/2008, Oracle 11g/10g, Teradata R13, R12, V2R6

Languages: SQL, T-SQL, PL/SQL, Unix Shell Scripts, CSS, HTML, XML, MHTML, PHP, Python, Java, C#, ASP.NET, VB.NET, C++, C

Testing Concepts: SDLC, STLC, Testing Levels, Testing Types

Other Tools: SQL Query Analyzer, SQL Enterprise Manager, SQL Server Management Studio, SQL Server Query Editor, Solution Explorer, Analysis Manager, DTS, UML, Informatica, Database Design and Normalization, Data Modeling, HP Quality Center, ALM, MS Word, MS Excel, MS PowerPoint

Operating Systems: Windows, Unix

Methodologies: Agile and Waterfall SDLC methodologies

PROFESSIONAL EXPERIENCE:

Confidential, Reston, VA

Senior Data Analyst

Responsibilities:

  • Interacted with the business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Developed the FRD (Functional Requirement Document) and data architecture document and communicated them to the concerned stakeholders. Conducted impact and feasibility analysis.
  • Extensively used Teradata utilities such as FastLoad, MultiLoad, BTEQ, and FastExport (see the FastLoad sketch after this list).
  • Created Teradata external loader connections such as MLoad (Upsert and Update) and FastLoad while loading data into the target tables in the Teradata database.
  • Created a Salesforce external loader application and used it to connect to salesforce.com.
  • Created mapplets and reusable transformations and used them in different mappings.
  • Created workflows and used various tasks such as Email, Event-Wait, Event-Raise, Timer, Scheduler, Control, Decision, and Session in the Workflow Manager.
  • Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range, and Pass-Through partitions.
  • Validated Informatica mappings for source compatibility due to version changes at the source.
  • Troubleshot long-running sessions and fixed the issues.
  • Implemented a daily and weekly audit process for the Claims subject area to ensure the data warehouse matches the source systems for critical reporting metrics.
  • Developed shell scripts for daily and weekly loads and scheduled them using the Unix Maestro utility.
  • Involved in writing and debugging SQL scripts, stored procedures, and functions.
  • Prepared ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
  • Involved in unit testing and system testing to check that the data extracted from different source systems was loaded into the targets accurately, according to the user requirements.
  • Worked with the reporting team to help them understand the user requirements for the reports and the measures on them, and helped them create canned reports.
  • Migrated repository objects, services, and scripts from the development environment to the production environment; extensive experience troubleshooting and solving migration and production issues.
  • Actively involved in production support; implemented fixes/solutions for issues/tickets raised by the user community.
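
For illustration only, a minimal FastLoad sketch of the kind of flat-file-to-Teradata load described above. This is a hedged sketch under assumed names: the TDPID, credentials, stage table, error tables, and input file (dw_host, etl_user, stage_db.claims_stg, /data/in/claims.dat) are hypothetical placeholders.

#!/bin/sh
# Hedged sketch: load a pipe-delimited flat file into a Teradata stage table
# with FastLoad. Connection details and object names are hypothetical.
fastload <<EOF
LOGON dw_host/etl_user,etl_password;
SET RECORD VARTEXT "|";
DEFINE claim_id  (VARCHAR(18)),
       claim_amt (VARCHAR(18)),
       load_dt   (VARCHAR(10))
FILE = /data/in/claims.dat;
BEGIN LOADING stage_db.claims_stg
      ERRORFILES stage_db.claims_err1, stage_db.claims_err2;
INSERT INTO stage_db.claims_stg (claim_id, claim_amt, load_dt)
VALUES (:claim_id, :claim_amt, :load_dt);
END LOADING;
LOGOFF;
EOF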

Environment: Informatica Power Center 8.6.1/9.0.1, Power Exchange 8.1, Teradata, Oracle 11g, Salesforce, XML, XSD, IBM MQ Series, Unix, Unix Shell Scripting, Erwin, MicroStrategy 7i.

Confidential, Lutherville, MD

Data Analyst

Responsibilities:

  • Involved in all phases of the SDLC: requirements, design, development, testing, production, and production support.
  • Handled a 2-terabyte data warehouse database.
  • Involved in end-to-end development, from OLTP systems to the creation of data marts.
  • Worked extensively with transformations such as Aggregator (monthly, quarterly, and yearly claim aggregations), connected and unconnected Lookups (to get the claim IDs from other tables, with SQL override to filter the data), Router (to route fraud cases and settled claims), and Sequence Generator.
  • Involved in performance tuning of large tables, such as the FIRST RESERVATIONS table with around 1 TB of data, by creating partitions and using stored procedures.
  • Also involved in tuning the mappings by tracking the reader, writer, and transformation threads in the session logs; used verbose tracing only during development and only with very small data sets.
  • Worked on flat files as sources, targets and lookups.
  • Used pmcmd to start and abort sessions and tasks; also used infacmd to track the status of the applications and automate the service notification emails.
  • Implemented various optimization techniques in the Aggregator, Lookup, and Joiner transformations.
  • Developed mappings/sessions using Informatica Power Center 8.6/7.1 for data loading, including mappings to load data into slowly changing dimensions (SCD).
  • Used Informatica parameter files to filter the daily data from the source system.
  • Involved in data quality analysis to determine the cleansing requirements.
  • Used various debugging techniques and the Informatica Debugger tool to debug the mappings.
  • Created test cases for unit testing, system integration testing, and UAT to check the data quality.
  • Created Oracle stored procedures to implement complex business logic for better performance.
  • Created materialized views for summary data to improve query performance.
  • Responsible for loading data into the warehouse from various sources using Oracle Loader to load millions of records, and used the Import/Export utility to load small tables.
  • Responsible for scheduling the workflows based on the nightly load.
  • Supported Oracle 10g databases running mission-critical 24x7 systems.
  • Used pmcmd extensively to start, stop, schedule, and monitor Informatica workflows (a minimal pmcmd sketch follows this list).
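
For illustration only, a minimal pmcmd sketch of the start-and-monitor pattern mentioned above. This is a hedged sketch: the integration service, domain, folder, workflow, credentials, and notification address (INT_SVC, DOM_DEV, CLAIMS_FOLDER, wf_daily_claims_load, etl_support@example.com) are hypothetical placeholders.

#!/bin/sh
# Hedged sketch: start an Informatica workflow, wait for completion, and report status.
# Service, domain, folder, workflow, credentials, and addresses are hypothetical.
pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u etl_user -p etl_password \
      -f CLAIMS_FOLDER -wait wf_daily_claims_load

if [ $? -ne 0 ]; then
    echo "wf_daily_claims_load failed" | mailx -s "Daily load failure" etl_support@example.com
    exit 1
fi

# Pull run details for the nightly-load log.
pmcmd getworkflowdetails -sv INT_SVC -d DOM_DEV -u etl_user -p etl_password \
      -f CLAIMS_FOLDER wf_daily_claims_load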

Environment: Informatica Power Center 8.6/7.1.1, Oracle 9i/10g, PL/SQL, MicroStrategy 7i, Mainframe DB2, MS Visio, Erwin (data modeling), TOAD, Windows 2000, UNIX AIX 5.1

Confidential, Dayton, OH

Data Analyst

Responsibilities:

  • Participated in requirements discussions with business units, ensuring all business requirements were satisfactorily met.
  • Designed and implemented parameterized and cascading parameterized reports using SSRS.
  • Worked extensively with the QA team on designing the Test Plan and Test Cases for User Acceptance Testing.
  • Created and maintained documentation on processes, application configuration, and material for users.
  • Experience with ETL tools such as DTS for data flow from sources such as tables, views, and Excel files to other databases with proper mapping.
  • Created and maintained databases, tables, views, users, logins, indexes, check constraints, and business rules using T-SQL.
  • Conducted data analysis including acquisition, cleansing, transformation, modeling, visualization, documentation, and presentation of results.
  • Worked with Teradata SQL queries using Teradata SQL Assistant for ad hoc data pull requests.
  • Worked with pivot tables in Excel using data pulled from Teradata and Oracle.
  • Developed SSRS reports on statistical data during data migration.
  • Designed and implemented SQL queries for data analysis and data validation and to compare data between the test and production environments.
  • Tasked with baselining requirements and change management configuration in DOORS.
  • Managed all requirements in DOORS and Rational RequisitePro, making requirements available to all the teams.
  • Created drill-down, drill-through, linked, and sub-reports using SQL Server Reporting Services (SSRS).
  • Involved in the development, implementation, administration, and support of ETL processes for the Data Warehouse using DTS and SSIS.
  • Worked with the SSIS Import/Export Wizard to perform ETL operations.
  • Helped data modelers prepare models and reviewed them to check that the requirements were met.
  • Assisted data analysts with mapping tasks and analyzed existing reports.
  • Led preparation of the data warehouse requirements document.
  • Designed SSIS packages for workflow solutions and managed the data flow.
  • Exported and imported data between text files/Excel and SQL Server databases using BULK INSERT and the BCP utility (a minimal BCP/BULK INSERT sketch follows this list).
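
For illustration only, a minimal sketch of the BCP / BULK INSERT pattern referenced above. This is a hedged sketch: the server, database, table, credentials, and file paths (sqlhost, StagingDB.dbo.Members, /data/in/members.txt, D:\loads\members.txt) are hypothetical placeholders.

#!/bin/sh
# Hedged sketch: push a pipe-delimited text file into a SQL Server table with bcp,
# then run an equivalent BULK INSERT through sqlcmd. All names are hypothetical.
bcp StagingDB.dbo.Members in /data/in/members.txt \
    -S sqlhost -U etl_user -P etl_password -c -t "|"

# BULK INSERT reads a path visible to the SQL Server service itself.
sqlcmd -S sqlhost -U etl_user -P etl_password -Q "
BULK INSERT StagingDB.dbo.Members
FROM 'D:\\loads\\members.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRSTROW = 2);"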

Environment: MS Project, MS Excel, MS Word, MS Visio, MS Access, Visual Basic, SQL, SQL Server, Oracle, Quality Center, Informatica Power Center 6.0

Confidential, Piscataway, NJ

ETL Developer

Responsibilities:

  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Created shell scripts to fine-tune the ETL flow of the Informatica workflows.
  • Used Informatica file-watch events to poll the FTP sites for the external mainframe files (a comparable shell file-watch sketch follows this list).
  • Provided production support to resolve ongoing issues and troubleshoot problems.
  • Performed tuning at the functional level and map level; used relational SQL wherever possible to minimize data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, and relational connections.
  • Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.
  • Effectively worked in an Informatica versioned environment and used deployment groups to migrate objects.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Effectively worked in an onsite/offshore work model.
  • Used pre- and post-session assignment variables to pass variable values from one session to another.
  • Designed workflows with many sessions using Decision, Assignment, Event-Wait, and Event-Raise tasks, and used the Informatica scheduler to schedule jobs.
  • Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.
  • Performed unit testing at various levels of the ETL and actively participated in team code reviews.
  • Identified problems in existing production data and developed one-time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
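
For illustration only, a minimal shell sketch of a comparable file-watch loop (not Informatica's own event-wait implementation): poll a landing directory for an indicator file from the mainframe feed, then create the trigger file a workflow Event-Wait task listens for. All directories, file names, and the timeout are hypothetical placeholders.

#!/bin/sh
# Hedged sketch of a file-watch loop comparable to an Informatica event-wait:
# poll a landing directory until the mainframe indicator file arrives, then
# create the trigger file an Event-Wait task listens for. Paths are hypothetical.
LANDING_DIR=/data/ftp/inbound
INDICATOR=$LANDING_DIR/mainframe_extract.done
TRIGGER=/data/triggers/wf_mainframe_load.start
MAX_WAIT_MIN=120

i=0
while [ ! -f "$INDICATOR" ]; do
    i=$((i + 1))
    if [ "$i" -gt "$MAX_WAIT_MIN" ]; then
        echo "Indicator file not received within $MAX_WAIT_MIN minutes" >&2
        exit 1
    fi
    sleep 60
done

touch "$TRIGGER"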

Environment: Informatica Power Center 7.1, Informatica PowerMart, MS Access Reports, Unix Shell Scripting, SQL*Plus, Erwin, SQL*Loader, MS SQL Server 2008, Sun Solaris 2.7, DB2.

Confidential

ETL Developer

Responsibilities:

  • Resolved issues related to the Enterprise Data Warehouse (EDW) and stored procedures in the OLTP system; analyzed, designed, and developed ETL strategies.
  • Developed mappings, sessions, and workflows in Informatica Power Center.
  • Identified performance issues in existing sources, targets, and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly for better performance.
  • Worked with heterogeneous sources to extract data from Oracle databases, XML, and flat files and load it into a relational Oracle warehouse.
  • Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (connected and unconnected), and Filter.
  • Migrated day-one data for existing members in the PCMH program from the EDW into the OLTP system using an ETL process.
  • Performed tuning of SQL queries and stored procedures for speedy extraction of data to resolve and troubleshoot issues in the OLTP environment.
  • Troubleshot long-running sessions and fixed the issues related to them.
  • Worked with variables and parameters in the mappings to pass values between sessions.
  • Involved in the development of PL/SQL stored procedures, functions, and packages to process business data in the OLTP system (a minimal SQL*Plus sketch follows this list).
  • Carried out changes to the architecture and design of Oracle schemas for both OLAP and OLTP systems.
  • Worked with the Services and Portal teams on various occasions on data issues in the OLTP system.
  • Worked with the testing team to resolve bugs related to day-one ETL mappings before production.
  • Created weekly project status reports, tracked the progress of tasks against the schedule, and reported any risks and contingency plans to management and business users.
  • Involved in meetings with the production team on issues related to deployment, maintenance, future enhancements, backup, and crisis management of the DW.
  • Worked with the production team to resolve data issues in the production databases of the OLAP and OLTP systems.
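
For illustration only, a minimal sketch of compiling a simple PL/SQL procedure through SQL*Plus, of the kind referenced above. This is a hedged sketch: the connect string, schema, procedure, and table/column names (OLTPDB, etl_user, upd_member_status, members) are hypothetical placeholders, not objects from this project.

#!/bin/sh
# Hedged sketch: compile a simple PL/SQL procedure through SQL*Plus.
# Connect string and all object names below are hypothetical.
sqlplus -s etl_user/etl_password@OLTPDB <<EOF
CREATE OR REPLACE PROCEDURE upd_member_status (p_member_id IN NUMBER) AS
BEGIN
    UPDATE members
       SET status  = 'ACTIVE',
           updt_dt = SYSDATE
     WHERE member_id = p_member_id;
    COMMIT;
END upd_member_status;
/
SHOW ERRORS;
EXIT;
EOF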

Environment: Informatica Power Center 6.1, Oracle 11g/10g/9i/8i, PL/SQL, SQL Developer 3.0.1, Toad 11
