
Data/Business Analyst Resume


Plano, TX

SUMMARY

  • 7 years of IT experience in data analysis, design, development, maintenance, and documentation for data warehouse and related applications using ETL and BI tools, client/server, and web applications on UNIX and Windows platforms.
  • Comprehensive knowledge and understanding of healthcare IT, wholesale banking business lines, financial services anti-money laundering, and retail credit risk in fraud management platforms.
  • Extensive experience developing business requirements documents and high-level process-flow technical design documents, and efficiently liaising with SMEs to transform business requirements into technical process modeling frameworks.
  • Statistical data analysis using SAS/BASE for data management, analysis, and report generation.
  • Worked with heterogeneous relational databases such as Oracle and MS Access using SQL.
  • Experience in data mining, data mapping, and data modeling, with a good understanding of ETL tools such as Ab Initio, SSIS, and Informatica PowerCenter.
  • Experience with visual database design tools such as MySQL Workbench and user interface tools such as IBM Data Studio.
  • Experience implementing advanced ETL methodologies using Informatica PowerCenter.
  • Extensive analytics using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting, and advanced data charting.
  • Experience with databases (Teradata, Oracle, SQL Server) and with working on large volumes of data.
  • Experience writing, testing, and implementing SQL triggers, stored procedures, and functions using PL/SQL in Oracle 11g.
  • Extensive experience with Extraction, Transformation and Loading (ETL) and Business Intelligence (BI) tools.
  • Experience with performance monitoring and dashboard reporting tools such as Eclipse BIRT and Cognos BI Analysis and Report Studio.
  • Comprehensive understanding of importing and integrating data across Hadoop file management systems using HiveQL.
  • Experience with methodologies such as the Agile Scrum and Waterfall models, and in creating process maps, use cases, sequence diagrams, and activity diagrams.

TECHNICAL SKILLS

ETL TOOLS: Teradata, IBM DataStage, Informatica PowerCenter 9.x, Oracle Data Integrator

DATABASES: DB2, Microsoft SQL Server 2008, Oracle 11g/10g/9i, MS Access 2007; MySQL Workbench, IBM Data Studio

WEB TECHNOLOGIES: JSP and Servlets, Spring Framework, CSS-based layout, HTML, JDBC

REPORTING TOOLS: HP Quality Center, Medidata Rave, Cognos BI, BIRT Design Studio, SQL Developer, SQL*Loader, SQL*Plus

IDE/PLATFORMS: Windows 7/Vista/XP/2003/2000, UNIX 5.2/4.3, Sun Solaris, Windows NT 4.0, Eclipse, NetBeans, VB

PROGRAMMING: SQL, PL/SQL, C++, Java, UNIX shell scripting

ANALYTICAL TOOLS: SAS, Tableau business analytics, XML

PROFESSIONAL EXPERIENCE

Confidential, Plano, TX

Data/Business Analyst

Responsibilities:

  • Performed data extraction, transformation, and analysis in support of analytics projects or in response to requests from other stakeholders within marketing or across EBM, and worked with project management, design, and manufacturers/vendors to obtain workable schedules.
  • Transformed business requirement/rules documents into technical process modeling frameworks using use cases and process flow diagrams in high-level technical design documents, working in conjunction with development teams and SMEs.
  • Executed SQL queries to extract, manipulate, and/or calculate costing analyses (cost reduction, margins, pricing, MSRP).
  • Performed solution provider/vendor data consolidation to ensure accurate product information, maintained vendor/solution provider blacklists, standardized contractual as well as legal agreements, and streamlined event-based process chains with ARIS Express.
  • Used SAS programming statements to customize the presentation of data and generate detail, summary, and multi-panel reports.
  • Standardized best practices across all telecom supply chain processes, and developed and monitored performance indicator reports for all supply chain functions using MS Excel Power Pivot and SAS.
  • Generated reports using the code library (REPORT & TABULATE) and macros, and created new styles and table layouts using SAS ODS.
  • Performed spreadsheet analysis, graphics, descriptive and inferential statistical analysis, report generation, database access, and decision support using SAS/BASE.
  • Executed omni-channel marketing campaigns and found the appropriate data sources to complete requests for pre-analysis using Excel's advanced features such as macros, lookup functions, array formulas, pivot tables, conditional formatting, and advanced data charting.
  • Imported and integrated data from multiple data warehouse infrastructures to address ad hoc queries, viewing data in terms of tables and joins from the Hadoop Distributed File System (HDFS) using HiveQL.
  • Manipulated and processed large amounts of structured and unstructured data using Business Objects, and formatted reporting using VLOOKUPs, Power Pivot, and Power Query in MS Excel 2013.
  • Created data inventories and identified opportunities for novel analytics using internal as well as external data, including public, government, social/digital, and purchased data.
  • Carried out data mapping, table design, and optimization, and moved data from spreadsheets and desktops into corporate data warehouses; this included creating data models with robust data definitions involving entity-relationship-attribute, star, and dimensional models.
  • Performed ETL processes extracting data from operational and legacy systems to data marts using SQL Server MDS (Configuration Manager & Deployment Manager).
  • Established and managed the analytics infrastructure, including the SAS infrastructure and connections to enterprise big data warehouses and Hadoop data lakes.
  • Performed data profiling, data capture, coding management, and reporting data mapping for different sources using Cognos BI Analysis and Report Studio.
  • Used Teradata SQL Assistant to manage the repository (view and edit), establish data lineage among data marts, define custom routines and transforms, and import and export items between different DB2 systems or exchange metadata with other data warehousing tools.
  • Generated OLAP, drill-down, cross-tab, and parent/child reports using BIRT Design Studio.
  • Used the Agile Scrum methodology to build the different phases of the software development life cycle.
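The costing queries described above (cost reduction, margins, pricing, MSRP) can be sketched as a minimal example; the table layout, column names, and data here are illustrative assumptions, not the actual schema:

```python
import sqlite3

# Hypothetical product-cost table; names and values are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE products (sku TEXT, vendor TEXT, unit_cost REAL, msrp REAL);
INSERT INTO products VALUES
  ('A100', 'VendorX', 60.0, 100.0),
  ('A200', 'VendorX', 45.0, 50.0),
  ('B300', 'VendorY', 20.0, 80.0);
""")

# Average margin per vendor: the kind of ad hoc costing query run
# against MSRP and unit cost.
rows = conn.execute("""
    SELECT vendor,
           ROUND(AVG((msrp - unit_cost) / msrp), 2) AS avg_margin
    FROM products
    GROUP BY vendor
    ORDER BY vendor
""").fetchall()
```

The same aggregation pattern (GROUP BY plus a derived margin expression) carries over to Teradata or Oracle with only dialect-level changes.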

Environment: Teradata SQL Assistant, Cognos BI Analysis Studio, DB2, Business Objects Designer & Query, SQL, Power Pivot/Query, Oracle 10g, MS Visio, Hadoop, HiveQL, UNIX, SAS.

Confidential, DE

Business- Data Analyst

Responsibilities:

  • Created mapping documents, business requirements documents, and technical design documents, and liaised between the analysis team and SMEs.
  • Involved in data cleansing by writing complex SQL queries, and captured requirements efficiently using use cases.
  • Used several built-in reporting tools to generate data for analysis, including Cognos BI, Go Dashboards, and SAS Base Analysis Studio.
  • Worked with formatting, the REPORT procedure, graphic and text-based displays, and statistically oriented procedures in SAS packages.
  • Improved the overall execution process and optimized data analysis in Excel 2010 using the SUMIFS, SUMPRODUCT, INDEX, MATCH, and LOOKUP functions.
  • Worked extensively on providing clean data for the subsequent migration of front-end activity to digital channels.
  • Designed mid- and large-sized campaigns (Batch & CFM) for the digital marketing and micro-segmentation teams.
  • Assisted in streamlining the decision-making process by leveraging structured and unstructured consumer insights and analysis of digital payments data to enable faster processing.
  • Generated reports after every level of execution for business owners to review the data captured across digital platforms such as SmartSpend and MyBarclays.
  • Worked as a validation data analyst to validate the campaigns developed by the lead segmentation analyst.
  • Designed and developed matrix and tabular reports with drill-down, drill-through, and drop-down menu options using SSRS.
  • Used FACETS Analytics for fast and easy retrieval, display, and grouping of information when performing queries and generating reports.
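The Excel SUMIFS-style conditional aggregation used above can be expressed as a short sketch in plain Python; the campaign fields and figures are invented for illustration:

```python
# Hedged analog of Excel's SUMIFS: sum a value column over rows that
# match every criterion. Data and field names are illustrative only.
campaigns = [
    {"channel": "email",  "segment": "retail",  "spend": 120.0},
    {"channel": "email",  "segment": "premium", "spend": 300.0},
    {"channel": "social", "segment": "retail",  "spend": 80.0},
]

def sumifs(rows, value_key, **criteria):
    """Sum value_key over rows matching all keyword criteria."""
    return sum(r[value_key] for r in rows
               if all(r[k] == v for k, v in criteria.items()))

email_retail_spend = sumifs(campaigns, "spend", channel="email", segment="retail")
```

In Excel 2010 the equivalent would be `=SUMIFS(spend_range, channel_range, "email", segment_range, "retail")`.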

Environment: Teradata SQL Assistant, Flat files, SQL, PL/SQL, Shell scripts, UNIX, MS Excel, SAS

Confidential, Thousand Oaks, CA

Business Analyst

Responsibilities:

  • Reviewed the business requirements and worked with the SME team to bridge gaps and design high-level functional documents describing technical details and workflow.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database to validate and test data using TOAD.
  • Identified and tracked the slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
  • Extensively used Informatica Designer components such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer, Workflow Manager, and Workflow Monitor; performed data profiling, data capture, coding, management, and reporting data mapping for different sources using Medidata Rave.
  • Wrote test scenarios based on business requirements and business use cases.
  • Documented test cases corresponding to business rules and other operating conditions.
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Validated the data by a reverse-engineering methodology, i.e., backward navigation from target to source.
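A basic form of the staging-versus-warehouse validation above is a set-difference query: find keys loaded into staging that never reached the target. The sketch below uses sqlite3 with invented table names as a stand-in for the Oracle/TOAD environment:

```python
import sqlite3

# Illustrative staging and warehouse tables; names and data are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

# Rows present in staging but missing from the warehouse: a simple
# data-integrity check run after each load.
missing = conn.execute("""
    SELECT order_id FROM stg_orders
    EXCEPT
    SELECT order_id FROM dw_orders
""").fetchall()
```

Running the same check in the opposite direction (warehouse EXCEPT staging) catches spurious target rows, which together approximate the backward navigation from target to source.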

Environment: Test Director, Informatica 8.1/7.1.4, SQL, PL/SQL, UNIX Shell Scripting, Oracle, MS SQL Server 2005, TOAD, Medidata

Confidential, Dallas, TX

Data Analyst

Responsibilities:

  • Developed use cases to uncover behavioral and functional requirements of the data mart design, determining whether the stated requirements were clear, complete, consistent, and unambiguous, and resolving any apparent conflicts.
  • Utilized shared containers for code reusability when implementing the predefined business logic, and provided integrated data to the web services digital teams based on requirements.
  • Created and scheduled the job sequences by checking job dependencies.
  • Wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Wrote PL/SQL stored procedures, functions, packages, and triggers to implement business rules in the application.
  • Developed shell scripts to invoke back-end SQL and PL/SQL programs.
  • Performed unit testing to check the validity of the data at each stage.
  • Used DataStage Director to debug the jobs and to view the error log to check for errors.
  • Implemented best practices in the development environment (code standards, code migration).
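A typical correlated subquery of the kind mentioned above selects, per group, the row matching an aggregate computed over that same group. This sketch uses sqlite3 with a made-up orders table:

```python
import sqlite3

# Correlated subquery sketch: pick each customer's most recent order.
# Schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, total REAL);
INSERT INTO orders VALUES
  ('acme', '2015-01-05', 100.0),
  ('acme', '2015-03-01', 250.0),
  ('zen',  '2015-02-10', 75.0);
""")

# The inner query re-runs for each outer row, correlated on customer.
latest = conn.execute("""
    SELECT customer, order_date, total
    FROM orders o
    WHERE order_date = (SELECT MAX(order_date)
                        FROM orders i
                        WHERE i.customer = o.customer)
    ORDER BY customer
""").fetchall()
```

The same pattern works unchanged in Oracle PL/SQL cursors or views, since the correlation is standard SQL.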

Environment: DataStage 6.0, Cognos BI, Advanced Excel, MS Access 2000

Confidential

Data Analyst

Responsibilities:

  • Performed end-to-end ETL development of the data mart, with data quality analysis to determine cleansing requirements.
  • Responsible for dimensional data modeling and populating the business rules, using mappings, into the repository for metadata management.
  • Understood the business needs, developed design documents, and implemented them in a functional database design.
  • Extensively used ETL and Informatica to load data from MS SQL Server, Excel spreadsheets, and flat files into the target Oracle database.
  • Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Designer.
  • Loaded the data from the tables into the OLAP application and further aggregated it to higher levels for analysis.
  • Developed PL/SQL procedures/packages to kick off the SQL*Loader control files/procedures to load the data into Oracle; tuned Informatica mappings and sessions for optimum performance.
  • Used Informatica features to implement Type I & II changes in slowly changing dimension tables; created and ran workflows and worklets using Workflow Manager to load the data into the target database.
  • Performed performance tuning of SQL queries in source and target sessions.
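The Type I versus Type II slowly changing dimension handling mentioned above can be illustrated with a minimal sketch; the dimension layout, keys, and dates are assumptions, not the production model:

```python
# Hedged SCD sketch. Type I overwrites in place (no history); Type II
# expires the current row and inserts a new versioned row.
dim = [
    {"cust_id": 7, "city": "Plano", "current": True, "valid_from": "2014-01-01"},
]

def scd_type1(dim, cust_id, city):
    """Type I: overwrite the attribute, keeping no history."""
    for row in dim:
        if row["cust_id"] == cust_id:
            row["city"] = city

def scd_type2(dim, cust_id, city, as_of):
    """Type II: flag the current row as expired, append a new version."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"]:
            row["current"] = False
    dim.append({"cust_id": cust_id, "city": city,
                "current": True, "valid_from": as_of})

scd_type2(dim, 7, "Dallas", "2015-06-01")
```

In Informatica the same split shows up as an Update Strategy transformation choosing between DD_UPDATE (Type I) and expire-plus-insert (Type II) paths.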

Environment: Informatica PowerCenter 6.2, Informatica PowerConnect, OLAP, Oracle 8i, MS SQL Server 7.0/2000, PL/SQL, Erwin 3.5, TOAD, UNIX shell scripting.

Confidential

Data Warehousing/ Quality Assurance Analyst

Responsibilities:

  • Reviewed the business requirements and worked with the business and requirements teams on gaps found during the review.
  • Created and executed SQL queries to perform data integrity testing on an Oracle database to validate and test data using TOAD.
  • Identified and tracked the slowly changing dimensions/mini-dimensions and heterogeneous sources, and determined the hierarchies in dimensions.
  • Created test case scenarios, executed test cases, and maintained defects in internal bug tracking systems.
  • Developed and executed various manual testing scenarios and thoroughly documented the process to perform functional testing of the application.
  • Performed extensive data validations against the data warehouse.
  • Wrote SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Compared actual results with expected results, and validated the data by a reverse-engineering methodology, i.e., backward navigation from target to source.
  • Worked extensively on the back end, using SQL queries to validate the data in the database.
  • Responsible for creating test data for the testing environment.
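Creating test data for a testing environment, as in the last bullet, usually means generating deterministic, schema-shaped rows. This sketch assumes a claims-like record layout; the column names, value ranges, and seed are illustrative:

```python
import random

# Illustrative test-data generator; the real schema and ranges would
# come from the warehouse's data dictionary.
def make_test_claims(n, seed=42):
    rng = random.Random(seed)  # fixed seed makes each test run reproducible
    statuses = ["PAID", "DENIED", "PENDING"]
    return [
        {"claim_id": i,
         "amount": round(rng.uniform(10, 500), 2),
         "status": rng.choice(statuses)}
        for i in range(1, n + 1)
    ]

claims = make_test_claims(5)
```

Seeding the generator is the key design choice: identical test data on every run lets expected results be compared exactly, which manual validation scenarios depend on.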

Environment: Data Stage, DB2, UNIX, IBM Mainframe, PL/SQL

Confidential

Data Analyst

Responsibilities:

  • Identified source systems, their connectivity, and related tables and fields, and ensured data consistency for mapping.
  • Created reusable components using shared containers for local or shared use.
  • Analyzed, designed, developed, implemented, and maintained moderate to complex initial-load and incremental-load jobs to provide data to the ODS.
  • Worked with DataStage client tools: Designer, Director, and Administrator.
  • Created jobs and sequences to extract allowance, benefit, broker, claim, member, provider, and employee group data, applying various business rules.
  • Used DataStage Administrator to create environment variables and set project-level permissions.
  • Analyzed data issues in the load, provided solutions for the issues, discussed them with the data owners, and created DataStage jobs to correct the data.
  • Developed jobs in DataStage using different stages such as Transformer, Aggregator, and Lookup.
  • Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and load it into the data marts.
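The incremental-load jobs above typically follow a watermark pattern: only source rows changed since the last load are applied to the ODS. A minimal sketch, with invented field names and dates:

```python
# Hedged incremental-load sketch: filter source rows newer than the
# previous load's watermark, upsert them, and advance the watermark.
source = [
    {"id": 1, "updated": "2015-01-01", "val": "a"},
    {"id": 2, "updated": "2015-02-01", "val": "b"},
    {"id": 3, "updated": "2015-03-01", "val": "c"},
]
last_watermark = "2015-01-15"  # persisted after the previous run

delta = [r for r in source if r["updated"] > last_watermark]

target = {}                     # id -> row, standing in for the ODS table
for row in delta:
    target[row["id"]] = row     # upsert only the changed rows

new_watermark = max(r["updated"] for r in delta)
```

An initial load is the degenerate case with an empty watermark, so one job design can serve both modes.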

Environment: IBM Mainframes, Informatica, Cognos, SQL Developer, HP Quality Center
