
Data Analyst / Business Analyst Resume


PROFESSIONAL SUMMARY:

  • 7 years of experience in the Information Technology (IT) Industry serving as a Data Analyst in a Data warehousing environment.
  • In-depth knowledge of SDLC phases such as requirements analysis, design, development, implementation, deployment, and maintenance, with a strong understanding of methodologies such as Waterfall, Iterative, and Agile.
  • Working knowledge of Health and Insurance domains.
  • Developed data mapping documentation to establish relationships between source and target tables including transformation processes using SQL.
  • Experienced in working as a Data Analyst performing complex data profiling, data definition, data mining, and data analytics; validating and analyzing data; and presenting reports.
  • Experience with business intelligence (BI) tools and technologies such as data warehousing, reporting and querying tools, data mining, and spreadsheets.
  • Expertise in data consolidation and harmonization.
  • Experienced in writing UNIX shell scripts, SQL*Loader scripts, procedures/functions, triggers, and packages.
  • Vast experience working in the area of data management, including data analysis, gap analysis, and data mapping.
  • Excellent exposure in Data Profiling with reference to Data Warehouse and BI development.
  • Worked extensively with Dimensional Modeling and Data Migration.
  • Experience performing data validation and transforming data from Oracle RDBMS to SAS datasets.
  • Experience in DBMS/RDBMS implementation using object-oriented concepts and database toolkits.
  • Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
  • Developed SQL scripts for data validation and cleansing.
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and Interface requirements documents.
  • Conducted extensive data validation on large data sets, communicated conclusions and recommendations to Project Manager.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Ensured all documents and workflows meet the client requirements.
  • Worked with Data Modelers to create a Physical Data Model from the Logical Data Model using Erwin tool.
  • Experience with creating reports using Business Objects.
  • Excellent understanding of Microsoft BI toolset including Excel, Power BI, SQL Server Analysis Services, Visio, Access.
  • Experience creating master data workbooks that capture ETL requirements such as mapping rules, physical data element structures, and their descriptions.
  • Extensively used SAS procedures such as PROC MEANS and PROC FREQ, along with other statistical calculations, for data validation.
  • Proficient in programming using SQL, PL/SQL.
  • Strong knowledge of database management in writing SQL queries using complex joins, grouping, aggregation, and nested subqueries, and resolving key performance issues (a representative query sketch follows this summary).
  • Extensive experience in creating simple and parameterized reports and complex reports involving sub reports, charts and graphs in various BI and data visualization tools like Tableau.
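
A minimal sketch of the kind of validation query described above (complex joins, grouping, and aggregation), assuming hypothetical source and target tables src_orders and dw_orders; the actual table and column names would depend on the project:

    -- Compare per-day row counts and amount totals between a hypothetical
    -- staging table and its warehouse target; only mismatched days are returned.
    SELECT s.load_date,
           COUNT(*)            AS src_rows,
           SUM(s.order_amount) AS src_amount,
           t.tgt_rows,
           t.tgt_amount
    FROM   src_orders s
    JOIN  (SELECT load_date,
                  COUNT(*)          AS tgt_rows,
                  SUM(order_amount) AS tgt_amount
           FROM   dw_orders
           GROUP  BY load_date) t
      ON   t.load_date = s.load_date
    GROUP  BY s.load_date, t.tgt_rows, t.tgt_amount
    HAVING COUNT(*) <> t.tgt_rows
        OR SUM(s.order_amount) <> t.tgt_amount;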

TECHNICAL SKILLS:

Software: SharePoint, Spark, Microsoft Office (Word, PowerPoint, Excel, Outlook), Adobe (Photoshop, Dreamweaver), Prezi

Reporting Tools: SAP Business Objects, CoStar, WebFOCUS (Information Builders)

Data Analysis Tools: Tableau, Python, Advanced Excel (macros, VLOOKUP, HLOOKUP)

Programming Models and Methodologies: Scrum, Sprint, SDLC, Waterfall, Agile

Project Management Tools: MS Project, Workfront, Lucernex, WBS Chart Pro, JIRA

Modeling Tools: MS Visio, Erwin, SQL Power Architect, Mockplus

Programming Languages: C, C++, XML, Java, JavaScript, HTML, CSS

Database Management: Microsoft Access, Oracle, SQL Server 2008, MySQL, IBM DB2

PROFESSIONAL EXPERIENCE:

Data Analyst / Business Analyst

Confidential

Responsibilities:

  • Collaborate with developers, testers, and end users to ensure technical compatibility.
  • Extract data from different databases per the business requirements using SQL Server.
  • Worked on different mapping documents (FDW & IDW) that capture attribute paths in the object model and the data used to develop the workflow.
  • Formulate and define business and/or system scope and objectives based on both user needs and applicable industry requirements.
  • Work with users and stakeholders to analyze, communicate, and validate requirements, and then model them.
  • Devise or modify procedures to solve complex problems, considering business and/or system limitations, operating time, and the form of the desired results.
  • Created use case and process flow diagrams using MS Visio and Mockplus, and documented the steps involved for each actor.
  • Wrote SQL queries to pull the required information from the database.
  • Generated XML via a test harness and tested the XML for attributes missing from a particular Quote or Policy.
  • Added missing Policy, Claims, or Billing attributes to the schema.
  • Responsible for generating the XSD, verifying the newly added tags, and uploading it to Bitbucket after each sprint.
  • Responsible for updating Integration Definition workbook for every iteration with all the attributes.
  • Wrote complex SQL queries against Oracle, DB2, SQL Server, and other data sources (see the sample query after this list).
  • Detect, document, and recommend approaches for handling data quality issues.
  • Provide outstanding support to the business areas for data needs, including ad-hoc data requests/analysis, user support, and troubleshooting data-related issues.
  • Create dashboards using business intelligence (BI) tools such as Tableau, Power BI, and OBIEE based on business user needs.
  • Articulate data derivation or transformation rules in business terminology.
  • Developed and documented functional and non-functional detailed requirements.
  • Providing technical expertise understanding the relationship between the business, data and the technology.
  • Perform data analysis using advanced MS-Excel features and VBA Macros to provide business insights and support in taking business decisions.
  • Support end-to-end system testing throughout integration and user acceptance by performing bug triage to the appropriate tier in the system.
  • Develop data queries to perform business analysis and data research.
  • Work across multiple project methodologies such as Agile and Waterfall.
  • Perform gap analysis and identify the data elements needed by downstream systems for operations and reporting needs.
  • Create complex technical mapping documents involving SQL queries and XPath expressions for integrating various systems in an SOA architecture.
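
As referenced above, a minimal sketch of the kind of attribute check run before XML generation, assuming hypothetical policy and policy_attribute tables; the real object model and attribute names would come from the mapping documents:

    -- Find policies whose generated XML would be missing a required Billing
    -- attribute; table, column, and attribute names are illustrative only.
    SELECT p.policy_id,
           p.quote_number
    FROM   policy p
    LEFT JOIN policy_attribute a
           ON a.policy_id      = p.policy_id
          AND a.attribute_name = 'BILLING_ACCOUNT_ID'
    WHERE  a.policy_id IS NULL
    ORDER  BY p.policy_id;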

Environment: Tableau, Mockplus, Oracle 10/11/12, SQL Server 2012/2016, UML, Visio, Excel 2007/2010, VBA, SQL Server Management Studio, SQL Developer, and XMLSpy.

Data Analyst

Confidential

Responsibilities:

  • Involved in defining source-to-target business rules, data mappings, and data definitions.
  • Performing Data Validation / Data Reconciliation between disparate source and target systems (Salesforce, Cisco-UIC, Cognos, Data Warehouse) for various projects.
  • Worked on projects specific to reporting - data validation and review of all data entered into Access as well as all reports generated from Access and Lucernex.
  • Performing data profiling and analysis on different source systems that are required for Customer Master.
  • Extracting data from different databases per the business requirements using SQL Server Management Studio.
  • Used SQL queries to pull the data from disparate systems and Data warehouse in different environments.
  • Perform periodic data integrity reviews to facilitate and monitor status for data cleanup to ensure consistent and reliable data flows.
  • Used data quality validation techniques to validate critical data elements (CDEs) and identified various anomalies.
  • Saved the company money by identifying incorrect lease-related billing, such as caps on CAM charges, with the help of Lucernex.
  • Interacting with the ETL, BI teams to understand / support on various ongoing projects.
  • Extensively used MS Excel for data validation.
  • Interacting with the Business teams and Project Managers to clearly articulate the anomalies, issues, findings during data validation.
  • Analyze data from site maintenance, identify issues, outline possible solutions and provide feedback to unit Manager.
  • Utilize the newly developed Lease Controller software to capture leases and costs; design & document new SOX process for new leases/subcategories.
  • Writing complex SQL queries to validate the data against various reports generated by Cognos (see the reconciliation sketch after this list).
  • Generating weekly, monthly reports for various business users according to the business requirements.
  • Gathering data from conversations and inputting it into the CoStar database; manipulating/mining data from database tables (Redshift, Oracle, data warehouse).
  • Providing analytical network support to improve quality and standard work results.
  • Ensure data files are produced in accordance with approved business logic rules.
  • Utilize a broad variety of statistical and big data tools such as SAS, R, MLlib, graphing tools, Hadoop, Spark, MapReduce, and others.
  • Support validation team responses for site maintenance findings as well as corrective actions.
  • Create statistical models using distributed and standalone approaches to build various diagnostic, predictive, and prescriptive solutions.
  • Created and implemented procedure manuals for co-workers and future lease administration staff.
  • Broad knowledge of programming and scripting (especially in Python).
  • Improved processes and accuracy with advanced Excel tools including pivot tables, INDEX, and VLOOKUPs.
  • Interface with other technology teams to load (ETL), extract, and transform data from a wide variety of data sources.
  • Provide feedback in continuing validation process improvement.
  • Created additional data validation reports to help ensure accurate data and identify possible data issues.
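
A minimal sketch of the reconciliation referenced above, assuming a hypothetical Salesforce staging table and Customer Master target; actual object names and keys would come from the source-to-target mappings:

    -- Customers present in the staged Salesforce extract but missing from the
    -- Customer Master target (use MINUS instead of EXCEPT on Oracle).
    SELECT sf.customer_id
    FROM   stg_salesforce_customer sf
    EXCEPT
    SELECT cm.customer_id
    FROM   customer_master cm;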

Environment: Data Governance, SQL Server, Python, ETL, MS Office Suite - Excel (Pivot tables, VLOOKUP), DB2, R, Visio, HP ALM, Agile, Azure, Data Quality, CoStar, Lucernex, Siterra, Tableau, and Reference Data Management.

Data Validation Analyst

Confidential, Bloomfield, CT

Responsibilities:

  • Demonstrable expertise in core IT processes, utilizing ETL tools to query, validate, and analyze data.
  • Expert business and technical requirements documentation skills employing contemporary tools for data mapping, diagramming, Use Cases, and business rules to produce concise functional specifications.
  • Follow and assess the business process model defining metadata rules and critical data elements.
  • Conduct analysis, gather requirements, develop Use Cases, data mapping, and workflow diagrams.
  • Investigate unused modules of the DQM and report viability and feasibility for implementation.
  • Utilize multimedia office suite applications and conduct surveys for high level dashboard reporting.
  • Performing daily integration and ETL tasks by extracting, transforming and loading data to and from different RDBMS.
  • Interpret and analyze complex lease clauses such as tenant radius restrictions while maintaining current and accurate lease administration database information.
  • Creating complex SQL queries and scripts to extract and aggregate data to validate the accuracy of the data.
  • Gathered business requirements and translated them into clear and concise specifications and queries.
  • Prepare high-level analysis reports with Excel and Tableau; provide feedback on data quality, including identification of billing patterns and outliers.
  • Identify and document limitations in data quality that jeopardize internal and external data analysis.
  • Responsible for designing an approach for the application testing and the testing of interfaces that are integrated with the application.
  • Conduct data validation on inputs that trigger member communications including identifying high risk data files
  • Interacting with the Business teams and Project Managers to clearly articulate the anomalies, issues, findings during data validation.
  • Wrote standard SQL queries to perform data validation and created Excel summary reports (pivot tables and charts); see the sample checks after this list.
  • Gather analytical data to develop functional requirements using data modeling and ETL tools.
  • Improved Data quality using Macros, Advanced Excel functions, pivot tables & VLOOKUP to identify missing data.
  • Performed data profiling and data validation to verify data accuracy between the data warehouse and source systems.
  • Used ref cursors and collections with bulk bind and bulk collect to access complex data resulting from joins across a large number of tables when extracting data from the data warehouse.
  • Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and fast response using Oracle hints and explain plans.
  • Used Teradata as a source and a target for a few mappings.
  • Loaded data from an MS Access database to SQL Server 2005 using SSIS (creating staging tables and then loading the data).
  • Extensively used MS Excel for data validation.
  • Highly proficient in using T-SQL for developing complex Stored Procedures, Triggers, Tables, Views, User Functions, User profiles, Relational Database Models and Data Integrity, SQL joins and Query Writing.
  • Migrated MS Access databases to SQL Server 2012.
  • Requirements gathering, analysis, Use Cases, data mapping, and workflow diagramming.
  • Data quality analysis and execution of the Data Quality Management (DQM) package.
  • Wrote SQL queries using analytical functions.
  • Documented Business Requirements, Functional Specifications, User stories.
  • Created UML based diagrams such as Activity diagrams using MS Visio.
  • Perform data extrapolation and validation of reports for analysis and audits.
  • Created T-SQL statements (SELECT, INSERT, UPDATE, DELETE) and stored procedures.
  • Performed drill down analysis reports using SQL Server Reporting Services.
  • Documented the extent to which data fails to meet threshold reporting requirements.
  • Developed various SQL scripts and anonymous blocks to load data into SQL Server 2005.
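
As noted above, a minimal sketch of the standard validation checks, assuming a hypothetical dim_member table with member_id and email columns; the actual critical data elements and thresholds are defined per project:

    -- Duplicate-key check on a hypothetical warehouse dimension table.
    SELECT member_id,
           COUNT(*) AS dup_count
    FROM   dim_member
    GROUP  BY member_id
    HAVING COUNT(*) > 1;

    -- Null-rate check on a critical data element in the same table.
    SELECT COUNT(*)                                       AS total_rows,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) AS null_emails
    FROM   dim_member;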

Environment: Windows, MS Office (MS Word, MS Excel, MS PowerPoint, MS Access, MS SharePoint, MS Visio), SQL, SSIS, ETL, SSRS, Erwin, Tableau

Data Analyst

Confidential

Responsibilities:

  • Used Informatica Data Validation to create ETL testing scenarios and conducted data validation before and after migration.
  • Archived and retrieved data for several applications using Informatica, and backloaded the archived data into the data warehouse.
  • Analyzed the relevant business information, transformed the data into a required business format and loaded into the Data Warehouse.
  • Used Informatica to create repeatable tests that validate PowerCenter mapping data.
  • Performed the validation between the development target and the testing target.
  • Used the Informatica TDM tool for data masking; tested complex masking rules and plans in TDM to mask application data.
  • Perform periodic data integrity reviews to facilitate and monitor data cleanup status to ensure consistent and reliable data flows.
  • Used advanced excel to create pivot tables, charts, used VLOOKUP, and other functions.
  • Worked actively in ETL process and writing complex SQL queries for querying data against different databases (SQL Server and Netezza) for data verification process.
  • Created and executed all the test cases using Informatica PowerCenter.
  • Worked actively in data profiling, data scrubbing and data cleansing for Data Warehouse.
  • Wrote SQL queries to validate the data from the source to the target database (see the sketch after this list).
  • Developed test scripts using QTP for Functionality and Regression Testing
  • Used Rally for Test Management and Defect Management.
  • Worked in OBIEE to generate reports and tested those reports according to business needs.
  • Worked on automating most of the Manual testing scenarios using QTP.
  • Involved in writing automation scripts to validate the data from pre-stage through the different staging layers and into the fact table.
  • Used advanced Excel features (Pivot Tables, VLOOKUP, Reporting) to analyze the data.
  • Used UFT (Unified Functional Testing) for automation, with VBScript as the scripting language.
  • Actively involved in business and team meetings to understand the requirements properly.
  • Performed end to end data validation in the Historical Migration process from SQL Server to Netezza.
  • Performed end to end data validation in incremental data load in Netezza and making sure the data is flowing from Retention to Fact as per the lookup condition.
  • Used Informatica (Designer, Manager, and Workflow Manager) as the ETL tool to check whether data was loading correctly, and debugged the data using the Informatica Debugger.
  • Reported directly to the Director of Technology and Channel Managers on project metrics, status, schedules and risk assessments
  • Managed technical projects for software development and infrastructure improvements encompassing software and system upgrades.
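
A minimal sketch of the source-to-target validation referenced above, assuming the SQL Server source extract and the Netezza target have been staged side by side as stg_src_claims and stg_tgt_claims; all names are illustrative:

    -- Rows present in the source extract but missing or mismatched in the
    -- target after the historical migration.
    SELECT src.claim_id,
           src.claim_amount
    FROM   stg_src_claims src
    LEFT JOIN stg_tgt_claims tgt
           ON tgt.claim_id = src.claim_id
    WHERE  tgt.claim_id IS NULL
       OR  tgt.claim_amount <> src.claim_amount;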

Environment: Informatica, HP ALM, OBIEE, SQL Server, Netezza, UNIX, Oracle, Jira, Agility Workbench, Flat Files, QTP
