Data Analyst Resume

Plano, TX

SUMMARY

  • 8 years of IT experience in data analysis, design, development, maintenance, and documentation for data warehouse and related applications using ETL, BI tools, and client/server and web applications on UNIX and Windows platforms.
  • Comprehensive knowledge and understanding of healthcare IT, wholesale banking business lines, financial services in anti-money laundering, and retail credit risk in fraud management platforms.
  • Extensive experience developing business requirements documents and high-level technical design documents, liaising efficiently with SMEs to transform business requirements into technical frameworks.
  • Statistical data analysis using SAS/BASE for data management, analysis, and report generation.
  • Worked with heterogeneous relational databases such as Oracle and MS Access using SQL.
  • Experience in data mining, data mapping, and data modeling, with a good understanding of ETL tools such as Ab Initio, SSIS, and Informatica PowerCenter.
  • Experience with visual database design tools such as MySQL Workbench and user interface tools such as Data Studio.
  • Experience implementing advanced ETL methodologies using Informatica PowerCenter.
  • Extensive analytics using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting, and advanced data charting.
  • Experience with databases (Teradata, Oracle, SQL Server) and with working on large volumes of data.
  • Experience writing, testing, and implementing SQL triggers, stored procedures, and functions using PL/SQL in Oracle 11g.
  • Extensive experience with Extraction, Transformation, and Loading (ETL) and Business Intelligence (BI) tools.
  • Experience with performance monitoring and dashboard reporting tools such as Eclipse BIRT and Cognos BI Analysis and Report Studio.
  • Knowledge of the complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC), and Bug Life Cycle (BLC).
  • Experience with methodologies such as the Agile Scrum model and the Waterfall model, and with creating process maps, use cases, sequence diagrams, and activity diagrams.

TECHNICAL SKILLS

ETL TOOLS: Teradata, IBM DataStage, Informatica PowerCenter 9.x, Oracle Data Integrator

DATABASES: MySQL Workbench, Data Studio, DB2, Microsoft SQL Server 2008, Oracle 11g/10g/9i, MS Access 2007

WEB TECHNOLOGIES: JSP and Servlets, Spring Framework, CSS-based layout, HTML, JDBC

REPORTING TOOLS: HP Quality Center, Medidata Rave, Cognos BI, BIRT Design Studio, SQL Developer, SQL*Loader, SQL*Plus

OPERATING SYSTEMS/IDE: Windows 7/Vista/2000/XP/2003, UNIX 5.2/4.3, Sun Solaris, Windows NT 4.0, Eclipse, NetBeans, VB

PROGRAMMING: SQL, PL/SQL, C++, Java, UNIX shell scripting

ANALYTICAL TOOLS: SAS, Tableau Business Analytics, XML

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

Data Analyst

Responsibilities:

  • Perform data extraction, transformation, and analysis in support of analytics projects or in response to requests from other stakeholders within marketing or across EBM; work with project management, design, and manufacturers/vendors to obtain workable schedules.
  • Transform business requirement/rules documents into technical frameworks using use cases in a high-level technical design document, working in conjunction with development teams and SMEs.
  • Execute SQL queries to extract, manipulate, and/or calculate costing analyses (cost reduction, margins, pricing, MSRP).
  • Perform solution provider/vendor data consolidation to ensure accurate product information, maintain vendor and solution-provider blacklists, standardize contractual and legal agreements, and streamline vendor-related data across the organization with SAS Enterprise packages.
  • Use SAS programming statements to customize the presentation of data and generate detail, summary, and multi-panel reports.
  • Standardize best practices across all telecom supply chain processes; develop and monitor performance indicator reports for all supply chain functions using MS Excel Power Pivot and SAS.
  • Generate reports using the code library (REPORT & TABULATE) and macros, and create new styles and table layouts using SAS ODS.
  • Perform spreadsheet analysis, graphics, descriptive and inferential statistical analysis, report generation, database access, and decision support using SAS/BASE.
  • Execute omni-channel marketing campaigns and find the appropriate data sources to complete requests for pre-analysis using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting, and advanced data charting.
  • Import and integrate data from multiple data warehouses and data lakes, and analyze data from Business Objects designers.
  • Manipulate and process large amounts of structured and unstructured data using Business Objects, and format reporting using VLOOKUPs, Power Pivot, and Power Query in MS Excel 2013.
  • Create data inventories and identify opportunities for novel analytics using internal as well as external data, including public, government, social/digital, and purchased data.
  • Carry out data mapping, table design, and optimization, and move data from spreadsheets and desktops into corporate data warehouses; this includes creating data models with robust data definitions involving entity-relationship-attribute models and star (dimensional) models.
  • Run ETL processes extracting data from operational and legacy systems to data marts using SQL Server MDS (Configuration Manager & Deployment Manager).
  • Establish and manage the analytics infrastructure, including the SAS infrastructure and connections to enterprise data warehouses and Hadoop data lakes.
  • Perform data profiling, data capture, coding management, and reporting/data mapping for different sources using Cognos BI Analysis and Report Studio.
  • Use Teradata SQL Assistant to manage the repository (view and edit), establish data lineage among data marts, define custom routines and transforms, and import and export items between different DB2 systems or exchange metadata with other data warehousing tools.
  • Generate OLAP, drill-down, cross-tab, and parent/child reports using BIRT Design Studio.
  • Use the Agile Scrum methodology to build the different phases of the software development life cycle.
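
The costing-analysis queries described above (margins, pricing, MSRP) can be sketched roughly as follows. The table, columns, and figures are entirely hypothetical, and SQLite stands in for the Teradata/Oracle databases actually used:

```python
import sqlite3

# Hypothetical product-cost table; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE product_costs (
    product_id INTEGER,
    vendor     TEXT,
    unit_cost  REAL,
    msrp       REAL
);
INSERT INTO product_costs VALUES
    (1, 'VendorA', 60.0, 100.0),
    (2, 'VendorA', 45.0, 60.0),
    (3, 'VendorB', 80.0, 120.0);
""")

# Per-vendor average margin (in dollars and as a fraction of MSRP) --
# the kind of cost-reduction / pricing rollup described above.
rows = conn.execute("""
    SELECT vendor,
           ROUND(AVG(msrp - unit_cost), 2)          AS avg_margin,
           ROUND(AVG((msrp - unit_cost) / msrp), 4) AS avg_margin_pct
    FROM product_costs
    GROUP BY vendor
    ORDER BY vendor
""").fetchall()
```

The same GROUP BY rollup pattern extends directly to cost-reduction comparisons across time periods or vendor contracts.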

Environment: Teradata SQL Assistant, Cognos BI Analysis Studio, DB2, Business Objects Designer & Query, SQL, Power Pivot/Query, Oracle 10g, MS Visio, Hadoop, UNIX, SAS.

Confidential, DE

Data Analyst

Responsibilities:

  • Created mapping documents, business requirements documents, and technical design documents, and liaised between the analysis team and SMEs.
  • Involved in cleansing data by writing complex SQL queries, and in efficient requirement capture using use cases.
  • Used several built-in reporting tools to generate data for analysis, including Cognos BI, Go Dashboards, and SAS Base Analysis Studio.
  • Used formatting, the REPORT procedure, graphic and text-based displays, and statistically oriented procedures in SAS packages.
  • Improved the overall execution process and optimized data analysis in Excel 2010 using the SUMIFS, SUMPRODUCT, INDEX, MATCH, and LOOKUP functions.
  • Worked extensively on providing clean data for the subsequent migration of front-end activity to digital channels.
  • Designed mid- and large-sized campaigns (Batch & CFM) for the digital marketing and micro-segmentation teams.
  • Assisted in streamlining the decision-making process by leveraging structured and non-structured consumer insights and analysis of digital payments data to enable faster processing.
  • Generated reports after every level of execution for business owners to review the data captured across digital platforms such as SmartSpend and MyBarclays.
  • Worked as a validation data analyst to validate the campaigns developed by the lead segmentation analyst.
  • Designed and developed matrix and tabular reports with drill-down, drill-through, and drop-down menu options using SSRS.
  • Used FACETS Analytics for fast and easy retrieval, display, and grouping of information when performing queries and generating reports.
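
The Excel SUMIFS and INDEX/MATCH patterns mentioned above can be mirrored in code; the campaign records and helper functions below are hypothetical, written only to show what those formulas compute:

```python
# Illustrative campaign data -- not from the original work.
records = [
    {"campaign": "SmartSpend", "channel": "email",  "responses": 120},
    {"campaign": "SmartSpend", "channel": "mobile", "responses": 340},
    {"campaign": "MyBarclays", "channel": "email",  "responses": 80},
]

def sumifs(rows, value_key, **criteria):
    """Excel SUMIFS analogue: sum value_key over rows matching every criterion."""
    return sum(r[value_key] for r in rows
               if all(r[k] == v for k, v in criteria.items()))

def index_match(rows, return_key, match_key, match_value):
    """Excel INDEX/MATCH analogue: return return_key from the first matching row."""
    for r in rows:
        if r[match_key] == match_value:
            return r[return_key]
    return None

smartspend_total = sumifs(records, "responses", campaign="SmartSpend")
first_email_campaign = index_match(records, "campaign", "channel", "email")
```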

Environment: Teradata SQL Assistant, Flat files, SQL, PL/SQL, Shell scripts, UNIX, MS Excel, SAS

Confidential, Thousand Oaks, CA

Data Analyst/QA analyst

Responsibilities:

  • Reviewed the business requirements and worked with the SME team to bridge gaps and design high-level functional documents describing technical details and workflow.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database to validate and test data using TOAD.
  • Identified and tracked the slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
  • Extensively used Informatica Designer components such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor.
  • Performed data profiling, data capture, coding management, and reporting/data mapping for different sources using Medidata Rave.
  • Wrote test scenarios based on business requirements and business use cases.
  • Documented test cases corresponding to business rules and other operating conditions.
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Validated the data using a reverse-engineering methodology, i.e., backward navigation from target to source.
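
The staging-versus-warehouse validation queries described above follow a common shape: reconcile row counts, then diff the two tables. A minimal sketch, with hypothetical table names and SQLite standing in for Oracle:

```python
import sqlite3

# Hypothetical staging and warehouse tables with identical, valid data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_claims (claim_id INTEGER, amount REAL);
CREATE TABLE dw_claims  (claim_id INTEGER, amount REAL);
INSERT INTO stg_claims VALUES (1, 100.0), (2, 250.0), (3, 75.5);
INSERT INTO dw_claims  VALUES (1, 100.0), (2, 250.0), (3, 75.5);
""")

def validate(conn):
    """Pass only if counts match and no staging row is missing from the warehouse."""
    (stg_count,) = conn.execute("SELECT COUNT(*) FROM stg_claims").fetchone()
    (dw_count,)  = conn.execute("SELECT COUNT(*) FROM dw_claims").fetchone()
    # Rows present in staging but absent (or different) in the warehouse.
    missing = conn.execute("""
        SELECT claim_id, amount FROM stg_claims
        EXCEPT
        SELECT claim_id, amount FROM dw_claims
    """).fetchall()
    return stg_count == dw_count and not missing

ok = validate(conn)
```

Running the same EXCEPT in the other direction gives the "backward navigation from target to source" check mentioned above.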

Environment: Test Director, Informatica 8.1/7.1.4, SQL, PL/SQL, UNIX Shell Scripting, Oracle, MS SQL Server 2005, TOAD, Medidata

Confidential, Dallas, TX

Data Analyst/Business Analyst

Responsibilities:

  • Developed use cases to uncover behavioral and functional requirements of the data mart design, determining whether the stated requirements were clear, complete, consistent, and unambiguous, and resolving any apparent conflicts.
  • Utilized shared containers for code reusability when implementing predefined business logic, and provided integrated data to the web services/digital teams based on requirements.
  • Created and scheduled job sequences by checking job dependencies.
  • Wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Wrote PL/SQL stored procedures, functions, packages, and triggers to implement business rules in the application.
  • Developed shell scripts to invoke back-end SQL and PL/SQL programs.
  • Performed unit testing to check the validity of the data at each stage.
  • Used DataStage Director to debug jobs and to view the error log to check for errors.
  • Implemented best practices in the development environment (code standards, code migration).
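
A correlated subquery of the kind mentioned above compares each row against an aggregate computed over related rows. The schema below is purely illustrative, with SQLite standing in for the Oracle database:

```python
import sqlite3

# Hypothetical employee table to demonstrate a correlated subquery.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (emp_id INTEGER, dept TEXT, salary REAL);
INSERT INTO employees VALUES
    (1, 'IT', 90000), (2, 'IT', 70000),
    (3, 'HR', 60000), (4, 'HR', 65000);
""")

# Correlated subquery: the inner query re-runs per outer row (note the
# reference to e.dept), selecting employees above their own department's average.
above_avg = conn.execute("""
    SELECT emp_id, dept, salary
    FROM employees e
    WHERE salary > (SELECT AVG(salary)
                    FROM employees
                    WHERE dept = e.dept)
    ORDER BY emp_id
""").fetchall()
```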

Environment: DataStage 6.0, Cognos BI, Advanced Excel, MS Access 2000

Confidential

Data Analyst /QA Analyst

Responsibilities:

  • End-to-end ETL development of the data mart; data quality analysis to determine cleansing requirements.
  • Responsible for dimensional data modeling and for populating the business rules, via mappings, into the repository for metadata management.
  • Understood the business needs, developed design documents, and implemented them in a functional database design.
  • Extensively used ETL and Informatica to load data from MS SQL Server, Excel spreadsheets, and flat files into the target Oracle database.
  • Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Designer.
  • Loaded the data from the tables into the OLAP application and further aggregated it to higher levels for analysis.
  • Developed PL/SQL procedures/packages to kick off the SQL*Loader control files/procedures to load the data into Oracle; tuned Informatica mappings and sessions for optimum performance.
  • Used Informatica features to implement Type I and Type II changes in slowly changing dimension tables; created and ran workflows and worklets using Workflow Manager to load the data into the target database.
  • Performed performance tuning of SQL queries in source and target sessions.
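
The Type II slowly-changing-dimension handling mentioned above is configured inside Informatica mappings rather than hand-coded, but the underlying expire-and-insert pattern can be sketched as follows; table and column names are hypothetical, with SQLite standing in for the target database:

```python
import sqlite3

# Hypothetical customer dimension; valid_to IS NULL marks the current version.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,
    is_current  INTEGER
)""")
conn.execute("INSERT INTO dim_customer VALUES (42, 'Dallas', '2001-01-01', NULL, 1)")

def scd_type2_update(conn, customer_id, new_city, change_date):
    """Type II change: expire the current row, then insert a new versioned row.
    (A Type I change would instead overwrite city in place, losing history.)"""
    conn.execute("""UPDATE dim_customer
                    SET valid_to = ?, is_current = 0
                    WHERE customer_id = ? AND is_current = 1""",
                 (change_date, customer_id))
    conn.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                 (customer_id, new_city, change_date))

scd_type2_update(conn, 42, 'Plano', '2003-06-01')
history = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 42 ORDER BY valid_from"
).fetchall()
```

The history now holds both versions, so fact rows can join to whichever version was current at their transaction date.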

Environment: Informatica PowerCenter 6.2, Informatica PowerConnect, OLAP, Oracle 8i, MS SQL Server 7.0/2000, PL/SQL, Erwin 3.5, TOAD, UNIX shell scripting.

Confidential

Data Warehousing/ Quality Assurance Analyst

Responsibilities:

  • Reviewed the business requirements and worked with the business and requirements teams on gaps found during the review.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database to validate and test data using TOAD.
  • Identified and tracked the slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
  • Created test case scenarios, executed test cases, and maintained defects in internal bug-tracking systems.
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Performed extensive data validations against the data warehouse.
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Compared actual results with expected results, and validated the data using a reverse-engineering methodology, i.e., backward navigation from target to source.
  • Worked extensively on the back end, using SQL queries to validate the data in the database.
  • Responsible for creating test data for the testing environment.

Environment: Data Stage, DB2, UNIX, IBM Mainframe, PL/SQL

Confidential

Data Analyst

Responsibilities:

  • Identified source systems, their connectivity, and related tables and fields, and ensured data consistency for mapping.
  • Created reusable components using shared containers for local or shared use.
  • Analyzed, designed, developed, implemented, and maintained moderate-to-complex initial-load and incremental-load jobs to provide data to the ODS.
  • Worked with the DataStage client tools: Designer, Director, and Administrator.
  • Created jobs and sequences to extract allowances, benefits, brokers, claims, members, providers, and employee group data, applying various business rules.
  • Used DataStage Administrator to create environment variables and set project-level permissions.
  • Analyzed data issues in the load and provided solutions, discussing them with the data owners and creating DataStage jobs to correct the data.
  • Developed jobs in DataStage using different stages such as Transformer, Aggregator, and Lookup.
  • Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the data marts.
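
The initial-load versus incremental-load distinction above comes down to filtering the source on a change marker and upserting only the delta. A minimal sketch, with hypothetical tables and SQLite standing in for the actual source and ODS:

```python
import sqlite3

# Hypothetical source and ODS member tables for an incremental (delta) load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_members (member_id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT);
CREATE TABLE ods_members (member_id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT);
INSERT INTO src_members VALUES
    (1, 'Alice', '2020-01-01'),
    (2, 'Bob',   '2020-02-15'),
    (3, 'Cara',  '2020-03-10');
""")

def incremental_load(conn, last_load_ts):
    """Pull only source rows changed since the last load and upsert them into the ODS."""
    delta = conn.execute(
        "SELECT member_id, name, updated_at FROM src_members WHERE updated_at > ?",
        (last_load_ts,)).fetchall()
    conn.executemany(
        "INSERT OR REPLACE INTO ods_members VALUES (?, ?, ?)", delta)
    return len(delta)

# Only rows updated after the last load timestamp are moved.
loaded = incremental_load(conn, '2020-02-01')
```

An initial load is the degenerate case: run the same job with a timestamp earlier than every source row.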

Environment: IBM Mainframes, Informatica, Cognos, SQL Developer, HP Quality Center
