
Sr. Data Analyst Resume


West Bend, WI

SUMMARY

  • 9+ years of IT experience in the analysis, design, development, testing, and implementation of ETL & Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications. Identified the producers and consumers of master data and collected and analyzed metadata about master data.
  • Experience as a Data Analyst gathering data from different sources, performing data profiling and data definition, and loading data into the business warehouse. Worked with different kinds of source systems and gained a deep understanding of business and accounting terms and formulas.
  • Performed SAP data migration using Business Objects Data Services as the ETL tool.
  • Experienced as a Data Analyst performing complex data profiling, data definition, and data mining, validating and analyzing data, and presenting reports.
  • Good with data visualization tools such as Tableau, SAP, QlikView, Amazon QuickSight, OBIEE, and Google Analytics.
  • Used Business Intelligence reporting tools (QlikView, Power BI) to create KPIs and scorecards to improve production support goals and measures.
  • Experience in reporting, analytical thinking, process improvement, problem solving, time management, decision making, and issue resolution.
  • Technical expertise in ETL methodologies, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
  • Performed data analysis and data profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Excellent Software Development Life Cycle (SDLC) experience with good working knowledge of testing methodologies, disciplines, tasks, resources, and scheduling.
  • Experience in testing and writing SQL and PL/SQL statements.
  • Solid experience creating PL/SQL packages, procedures, functions, triggers, and views to retrieve, manipulate, and migrate complex data sets in Oracle databases.
  • Excellent knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches.
  • Experienced with Tableau Desktop and Tableau Server, with a good understanding of Tableau architecture.
  • Experience and understanding of database design, relational integrity constraints, OLAP, OLTP, and normalization.
  • Experience in Business Intelligence, generating various reports using Cognos and Business Objects.
  • Worked on parameterizing all variables and connections at all levels in UNIX.
  • Directly responsible for the extraction, transformation, and loading of data from multiple sources into the data warehouse.
  • Knowledge of creating UNIX shell scripts to automate data loading and to perform regular updates to database tables to keep them in sync with incoming data from other sources.
  • Designed ETL specification documents to load data from source to target using various transformations according to the business logic.
  • Broad knowledge of Waterfall, Agile, risk engineering, data modeling, and mapping.
  • Excellent experience writing SQL queries to validate data movement between different layers in a data warehouse environment (a small validation sketch follows this list).
  • Excellent understanding of data warehousing concepts - star and snowflake schemas, normalization/de-normalization, and dimension & fact tables.
  • Developed SQL scripts for accessing data from Teradata tables.
  • Expertise in source-to-target mapping in enterprise and corporate data warehouse environments.
  • Strong knowledge of developing and implementing ETL processes using DQE, UNIX shell, DB2, and Oracle.
  • Good Knowledge on Amazon Web Services (AWS) Cloud services like EC2, S3, EBS, RDS, VPC, and IAM.
  • Advanced knowledge of ETL, DSS, and data warehousing with OLAP technology.
  • Good experience preparing financial reports using Oracle SQL.
  • Ability to work independently as well as collaboratively; willing to relocate.
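
As an illustration of the layer-to-layer validation mentioned above, here is a minimal sketch in Oracle-style SQL; the stg_customer and dw_customer tables and their columns are hypothetical placeholders, not taken from any actual project.

    -- Hypothetical reconciliation between a staging table and its warehouse target
    SELECT (SELECT COUNT(*) FROM stg_customer) -
           (SELECT COUNT(*) FROM dw_customer) AS row_count_delta
    FROM   dual;

    -- Keys present in staging but missing from the warehouse layer
    SELECT s.customer_id
    FROM   stg_customer s
    LEFT JOIN dw_customer d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;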

PROFESSIONAL EXPERIENCE

Confidential - West Bend, WI

Sr. Data Analyst

Responsibilities:

  • Conducted major stakeholder interviews involving SMEs, business analysts, and other stakeholders to determine the requirements.
  • Developed and implemented an ETL framework using SSIS, stored procedures, and functions for extraction into SQL Server.
  • Created SSIS packages to transfer data between servers, load data into the database, and archive data files from heterogeneous sources such as SQL Server, Excel, and CSV.
  • Experience using SAS to read, import, and export data across file formats, including delimited files, DBF, MDB, and spreadsheet files.
  • Performed regression analysis on the dataset using SAS.
  • Formulated mapping specification documents (ETL mapping, QlikView report mapping).
  • Analyzed the accuracy of data warehouse fields for reporting tools including Business Warehouse and QlikView.
  • Influenced QlikView dashboard integrity, stability, revisions, and evolution based on customer feedback.
  • Collaborated with end users to identify needs and opportunities for improved data management and delivery surfaced in QlikView.
  • Oversaw the ongoing development and operations of an operational data store and enterprise data warehouse.
  • Improved and streamlined data flow and data quality processes to increase data accuracy, viability, and value.
  • Experienced in developing Power BI reports and dashboards from multiple data sources using data blending.
  • Proficient in importing/exporting large amounts of data from files to Teradata and vice versa.
  • Worked with IDQ on data quality: data cleansing, removing unwanted data, and verifying data robustness and correctness.
  • Created parameter-driven reports using the parameters tab in the SSRS GUI, setting up report-level parameters and building parameter-driven queries and stored procedures for the report datasets (see the parameterized query sketch after this list).
  • Generated various dashboards in Tableau Server using different data sources such as Teradata, Oracle, Microsoft SQL Server, and Microsoft Analysis Services.
  • Performed detailed data mapping to locate any field on the UI and correlate it to the corresponding database field.
  • Provided quantitative analysis of website visits and optimized media spending using statistical tests, data mining techniques, and data visualization tools with R.
  • Assessed and coordinated the prioritization of requirements from cross-enterprise business segments for incorporation into the enterprise master data management (MDM) solution.
  • Created various geographical and time-dimension reports (fiscal year, quarter, month, week, and daily) for all domains and published them through the report server.
  • Solid understanding of Business Process definition, Risk Analysis and SDLC methodologies.
  • Designed and generated database objects such as tables, views, synonyms, sequences, PL/SQL packages, and database triggers.
  • Used Business Intelligence tool sets and built solutions using these tools (Business Objects, Tableau, SAS), with a preference for Business Objects.
  • Created mapping documents, ETL technical specifications, and various documents related to data migration.
  • Wrote hundreds of DDL scripts to create tables and views in the company data warehouse.
  • Extensively used ETL methodology to support data extraction, transformation, and loading in a complex EDW using Informatica.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Involved in developing simple & complex mappings using Informatica to load dimension & fact tables as per STAR Schema techniques.
  • Involved in designing database using MS Visio.
  • Responsible for designing, developing, and testing the software (Informatica, PL/SQL, UNIX shell scripts) used to maintain the data marts (loading data and analyzing it using OLAP tools).
  • Involved in developing reports using advanced Teradata techniques such as RANK and ROW_NUMBER.
  • Created report schedules, data connections, projects, and groups in Tableau Server.
  • Implemented Agile Methodology for building an internal application.
  • Deployed SSRS reports to the reporting server in Native and SharePoint-integrated mode and assisted in troubleshooting deployment problems.
  • Developed a consolidation process for a number of MS SQL Servers within the firm.
  • Responsible for performing requirement analysis and data analysis for various work streams with the help of SQL and Excel.
  • Strong SQL skills for querying and analysis across various source tables with applied conditions; wrote SQL joins and subqueries.
  • Created standard and ad-hoc reports using Cognos to estimate ETL process load statistics.
  • Worked on tuning ETL processes to reduce processing time.
  • Monitored the ETL process daily and performed report and data corrections before the daily report was sent to stakeholders and business partners.
  • Provided key data analysis using ad-hoc MS SQL queries and Excel pivot tables.
  • Processed data entry, entering data into and maintaining several Excel spreadsheets.
  • Analyzed data and generated reports using Excel and VBA.
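
As referenced in the parameter-driven reporting bullet above, the sketch below shows the general shape of a T-SQL stored procedure that feeds an SSRS dataset; the procedure name, table, and columns are hypothetical and illustrate only the pattern, not the actual reports.

    -- Hypothetical parameter-driven dataset procedure for an SSRS report
    CREATE PROCEDURE dbo.rpt_DailySales
        @StartDate DATE,
        @EndDate   DATE,
        @Region    VARCHAR(50) = NULL   -- NULL returns all regions
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT  s.SaleDate,
                s.Region,
                SUM(s.Amount) AS TotalAmount
        FROM    dbo.DailySales AS s
        WHERE   s.SaleDate BETWEEN @StartDate AND @EndDate
          AND  (@Region IS NULL OR s.Region = @Region)
        GROUP BY s.SaleDate, s.Region
        ORDER BY s.SaleDate;
    END;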

Confidential, Newark NJ

Data Analyst

Responsibilities:

  • Documented all data mapping and transformation processes in the functional design documents based on the business requirements.
  • Developed a prediction algorithm using advanced data mining techniques to group similar properties into sub-markets; each zip code was divided into sub-markets accordingly.
  • Worked on parameterizing all variables and connections at all levels in UNIX.
  • Prepared high-level logical data models using Erwin and later translated them into physical models using the forward engineering technique.
  • Assisted in the deployment of the QlikView BI tool by gathering reporting requirements, creating dashboard mockups, performing data QA and testing, and training both management and staff.
  • Involved in developing dashboards featuring detailed analytical reports to promote greater business insight and more informed decision making using the QlikView Developer tool.
  • Experience in report writing using SQL Server Reporting Services (SSRS), creating various report types such as drill-down, parameterized, cascading, conditional, table, matrix, chart, and sub-reports.
  • Developed and deployed reports in MS SQL Server environment using SSRS.
  • Performed data profiling and data quality analysis.
  • Experience working with large databases to perform data retrieval and manipulation across multiple input files using SAS data steps.
  • Hands-on experience providing Tableau report access to business users and reducing space on the server by creating extracts and parameters in large reports.
  • Developed, tested, implemented, and maintained reporting solutions using the Crystal Reports environment.
  • Worked with Agile and Scrum methodologies.
  • Responsible for defining the key identifiers for each mapping/interface.
  • Created technical specification documents based on the functional design document for the ETL coding to build the data mart.
  • Worked with Business Intelligence tools such as Informatica PowerCenter, DataStage, mainframe SAS, SSAS, OLTP, OLAP, and Business Objects.
  • Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using Informatica PowerCenter 9.1.
  • Participated in defining and establishing the client Master Data Management (MDM) organization within the Enterprise Data Office.
  • Documented, clarified, and communicated change requests with the requestor and coordinated with the development and testing teams.
  • Performed in-depth analysis of VSAM files to understand data anomalies and to draft data cleansing requirements.
  • Implemented various Software Development Life Cycle (SDLC) checkpoints throughout the project phases: analysis, design, development, testing, and deployment.
  • Reverse engineered all the source databases using Erwin.
  • Expert in creating SQL stored procedures, functions, and temporary tables for data input to Crystal Reports.
  • Worked with Informatica Data Quality (IDQ) for data cleansing, data matching, and data conversion.
  • Understand the reporting requirements and provide Tableau reporting solutions.
  • Created complex reports using Crystal Reports.
  • Prepared SQL scripts to verify and validate data between source and target through PL/SQL Developer (a sketch of this style of check follows this list).
  • Designed parallel jobs using various stages such as Join, Merge, Lookup, Filter, Remove Duplicates, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.
  • Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
  • Used data analysis techniques to validate business rules and identify low-quality and missing data in the existing data.
  • Created Teradata objects like Databases, Users, Tables, Views and Macros.
  • Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL.
  • Converted OBIEE reports and dashboards to Tableau reports and dashboards.
  • Responsibilities included analysis, data mapping, design, QA testing, and documentation of applications in support of end-user access to the data warehouse.
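
The source-to-target verification mentioned above typically boils down to set-difference and reconciliation queries; the sketch below is a hypothetical Oracle-style example (src_policy and tgt_policy are made-up names), not the actual project scripts.

    -- Rows present in the source but missing or different in the target
    SELECT policy_id, premium_amt, status
    FROM   src_policy
    MINUS
    SELECT policy_id, premium_amt, status
    FROM   tgt_policy;

    -- Row-count reconciliation per load date
    SELECT load_dt, COUNT(*) AS row_cnt
    FROM   src_policy
    GROUP BY load_dt
    ORDER BY load_dt;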

Confidential, Cary, NC

Data Analyst / ETL Tester

Responsibilities:

  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications, and was responsible for documenting functional and supplementary requirements in Quality Center.
  • Set up the environments to be used for testing and the range of functionality to be tested as per technical specifications.
  • Created SSIS packages to extract data from OLTP to OLAP systems and scheduled jobs to call the packages and stored procedures.
  • Leveraged data analytics capabilities to create a data mining process to extract customer preference information.
  • Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Involved in designing the business requirement collection approach based on the project scope and the SDLC using Agile methodology.
  • Responsible for business and data analysis, data profiling, data migration, data integration, and metadata management services.
  • Reverse engineered existing databases to understand the data flow and business flows of existing systems and to integrate new requirements into the future enhanced and integrated system.
  • Involved in data analysis for the data warehouse and data mart systems for the report generation process.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements.
  • Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
  • Responsible for various data mapping activities from source systems to Teradata.
  • Created the test environment for the staging area and loaded it with data from multiple sources.
  • Responsible for analyzing various data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server) from heterogeneous systems.
  • Developed PL/SQL programs and stored procedures for data loading and data validation.
  • Involved in analyzing source data coming from different sources such as XML, DB2, Oracle, and flat files.
  • Wrote hundreds of DDL scripts to create tables and views in the company data warehouse.
  • Extensively worked on Shell scripts for running SAS programs in batch mode on UNIX.
  • Followed company code standardization rule.
  • Tested Cognos reports to check whether they were generated as per the company standards.
  • Extensively used star schema methodologies in building and designing the logical data model into dimensional models.
  • Created views in Tableau Desktop that were published to the internal team for review and further data analysis and customization using filters and actions.
  • Performed ad-hoc analyses as needed.
  • Involved in Teradata SQL development, unit testing, and performance tuning, and ensured testing issues were resolved based on defect reports.
  • Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.
  • Wrote SQL scripts to analyze inbound data from the SAP system and to query data from databases.
  • Experience with star and snowflake schemas, data modeling, fact and dimension tables, and slowly changing dimensions.
  • Experience in creating UNIX scripts for file transfer and file manipulation.
  • Provided support to the client in assessing how many virtual user licenses would be needed for performance testing.
  • Ensured the onsite-to-offshore transition, QA processes, and closure of problems and issues.
  • Tested the database for field size validation, check constraints, and stored procedures, cross-verifying the field sizes defined within the application against the metadata (see the test-query sketch after this list).
  • Wrote test cases for data warehouse ETL to compare source and target database systems.
  • Extensively used the Informatica Debugger to validate data warehouse ETL mappings and to gain troubleshooting information about data and error conditions.
  • Tested several data warehouse ETL Informatica mappings to validate the business conditions.
  • Involved in error checking and testing of data warehouse ETL procedures and programs using the Informatica session logs.
  • Used SQL query tools for querying Oracle and Teradata SQL Assistant for querying Teradata.
  • Strong in writing SQL queries and make-table queries to profile and analyze data in MS Access.
  • Performed Verification, Validation, and Transformations on the Input data (Text files, XML files) before loading into target database.
  • Monitored the Informatica workflows using PowerCenter Monitor and checked session logs in case of aborted/failed sessions.
  • Performed web services testing and validated XML request/response data using SoapUI.
  • Used TOAD and DBArtisan tools to connect to the Oracle database to validate data populated by ETL applications.
  • Worked with the business team to test the reports developed in Cognos.
  • Involved in testing UNIX Korn shell wrappers that run various ETL scripts to load data into the target database (Oracle).
  • Created test cases and executed test scripts using Quality Center.
  • Involved in testing the XML files and checked whether data was parsed and loaded to staging tables.
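
The database test cases referenced above (duplicates, mandatory fields, field sizes) usually reduce to short SQL checks; the following is a hypothetical sketch against a made-up tgt_account table, not the actual test suite.

    -- 1. Duplicate-key check: should return no rows for a valid primary key
    SELECT account_id, COUNT(*) AS dup_count
    FROM   tgt_account
    GROUP BY account_id
    HAVING COUNT(*) > 1;

    -- 2. Mandatory-field (NOT NULL) check
    SELECT COUNT(*) AS null_names
    FROM   tgt_account
    WHERE  account_name IS NULL;

    -- 3. Field-size check against the documented metadata (e.g., 30 characters)
    SELECT account_id, LENGTH(account_name) AS name_len
    FROM   tgt_account
    WHERE  LENGTH(account_name) > 30;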

Confidential - Chicago, IL

Data Analyst

Responsibilities:

  • Responsible for data mapping, metadata maintenance, and enhancing existing conceptual and physical data models.
  • Conducted sessions with architects, business, and development teams to understand the change requests (CRs) and their effects on the system.
  • Maintained and updated the metadata repository based on the change requests.
  • Analyzed existing conceptual and physical data models and altered them using Erwin to support enhancements.
  • Conducted walkthroughs with the DBA to convey the changes made to the data models.
  • Utilized Rational ClearQuest to track change requests throughout the process.
  • Implemented SQL scripts to modify data to resolve assigned defects.
  • Worked on other SQL Server 2005 components: SSRS (SQL Server Reporting Services) and SSAS (SQL Server Analysis Services).
  • Worked simultaneously on various data sets/regions and created prediction models to help the attrition and retention team identify problems and make business decisions effectively.
  • Created databases and schema objects including tables and indexes, applied constraints, connected various applications to the database, and wrote functions, stored procedures, and triggers.
  • Created DTS packages to copy tables, schemas, and views and to extract data from Excel and Oracle using SSIS.
  • Created data mapping between two distinct models.
  • Coordinated with the QA team to test and validate the reporting system and its data.
  • Involved in developing dynamic SQL queries for reports on the enterprise data warehouse (OLAP) and operational database (OLTP); created custom logging in SSIS packages.
  • Involved in developing physical data models and created DDL scripts to create database schemas and database objects.
  • Used BTEQ and SQL queries to load data from the source into Teradata tables (a minimal BTEQ sketch follows this list).
  • Involved in developing data models for the OLTP database, the operational data store (ODS), the data warehouse (OLAP), and federated databases to support the client's enterprise information management strategy.
  • Performed in-depth data analysis on Oracle and Teradata systems.
  • Created job schedules to automate the ETL process.
  • Involved in developing Project document templates based on SDLC methodology.
  • Wrote documentation in single-source reusable chunks using an XML authoring tool and updated legacy documentation using FrameMaker.
  • Performed daily tasks including backup and restore by using SQL Server 2005 tools like SQL Server Management Studio, SQL Server Profiler, SQL Server Agent, and Database Engine Tuning Advisor.
  • Involved in developing high-performance PL/SQL code to extract, transform, and load data into a data warehouse.
  • Involved in developing algorithms and PL/SQL code for efficient retrieval and manipulation of complex data sets using PL/SQL packages.
  • Involved in developing star and snowflake schemas.
  • Worked closely with the ETL SSIS developers to explain the complex data transformation logic.
  • Created components, tools, techniques, methods, and procedures used in an OLAP environment for accessing and translating data into understandable and usable business information using SSAS.
  • Involved in developing complex stored procedures and views to generate drill-through, parameterized, and linked reports using SSRS.
  • Involved in extensive data validation using SQL queries and back-end testing.
  • Validated cube and query data from the reporting system back to the source system.
  • Tested analytical reports using Analysis Studio.
  • Involved in designing the high-level ETL architecture for overall data transfer from the source server to the Enterprise Services Warehouse, encompassing server names, database names, accounts, tables, direction of data flow, data mapping, the data dictionary, and metadata.
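
For the BTEQ loading mentioned above, a typical pattern is a scripted import into a staging table; the sketch below is hypothetical (the logon, file path, table, and columns are placeholders) and only illustrates the general BTEQ shape, not the actual load scripts.

    .LOGON tdprod/etl_user,********;

    .IMPORT VARTEXT ',' FILE = /data/in/accounts.csv;
    .REPEAT *
    USING (acct_id   VARCHAR(10),
           acct_name VARCHAR(50),
           open_dt   VARCHAR(10))
    INSERT INTO stg.account_stage (acct_id, acct_name, open_dt)
    VALUES (:acct_id, :acct_name, CAST(:open_dt AS DATE FORMAT 'YYYY-MM-DD'));

    .QUIT;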

Confidential, Confidential, PA

Data Analyst

Responsibilities:

  • Worked on data investigation and discovery to identify every source of customer data.
  • Responsible for researching data quality issues (data inaccuracies) in customer enrollment channels.
  • Identified inconsistencies in data collected from different sources, worked with business owners/stakeholders to assess business and risk impact, and provided solutions to business owners.
  • Involved in data profiling of multiple sources using SQL Server Management Studio and Toad and presented initial discovery in Excel tables and reports (see the profiling query sketch after this list).
  • Working knowledge in Analysis Services (SSAS), Reporting Services (SSRS) and Clustering.
  • Enhanced Crystal Reports and data warehouse analytical models in support of strategic business planning and at customers' requests.
  • Identified opportunities through data mining and analysis that could lead to successful outcomes for the program.
  • Installed the SSRS reporting service, developed SSRS reports with advanced features, deployed them into Report Manager, and assigned permissions to different user groups.
  • Created a data mapping document to map customer data from different systems to the enterprise customer data mart.
  • Analyzed the existing and future-state database environments and created transformation rules for the BI development team to build ETL for the customer mart.
  • Used DataStage Designer to develop various jobs and extract data from sources such as flat files and Oracle into SAS.
  • Understood the business intelligence of data warehousing environments and the data flow from source systems to target systems.
  • Responsible for developing new projects and supporting existing applications.
  • Designed enterprise reports using SQL Server Reporting Services (SSRS) that make use of multi-value parameter pick lists, cascading prompts, drill-through reports, drill-down reports, dashboard reports, dynamic matrix reports, and other reporting service features.
  • Developed and delivered dynamic reporting solutions using SQL Server Reporting Services (SSRS); created income statements and other financial reports using SSRS.
  • Designed and developed use cases, activity diagrams, sequence diagrams, and OOD using UML.
  • Performed analysis on enterprise data/report integration and provided functional specifications to the development team to build enterprise reporting systems.
  • Suggested various changes in physical model to support business requirements.
  • Developed server jobs to load data from flat files, text files, tagged text files, and MS SQL.
  • Migrated data from MS Excel to SQL Server Reporting Services using DTS and SQL loader utilities.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements.
  • Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
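
The data profiling described above was typically a matter of summary queries run in SQL Server Management Studio; the sketch below is hypothetical (dbo.CustomerEnrollment and its columns are illustrative only), showing the kind of volume, distinctness, completeness, and date-range checks involved.

    -- Hypothetical profiling query: volume, distinctness, completeness, date range
    SELECT  COUNT(*)                                       AS total_rows,
            COUNT(DISTINCT CustomerId)                     AS distinct_customers,
            SUM(CASE WHEN Email IS NULL THEN 1 ELSE 0 END) AS missing_emails,
            MIN(EnrollDate)                                AS earliest_enrollment,
            MAX(EnrollDate)                                AS latest_enrollment
    FROM    dbo.CustomerEnrollment;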
