
Data Analyst Resume


Boston, MA

PROFESSIONAL SUMMARY:

  • 7+ years of experience in Data Analysis, Development and Business Intelligence with Microsoft SQL Server 2000/2005/2008/2008 R2/2012 in Development, Test and Production environments in the financial, healthcare, retail and telecommunication domains.
  • Hands-on experience with the full Software Development Life Cycle (SDLC) using Waterfall, Agile, RUP and hybrid methodologies.
  • Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and interface requirements documents.
  • Worked on integration and implementation of projects and products, database creation, modeling, and calculation of object sizes, tablespaces and database sizes.
  • Handled modeling changes to legacy and future databases, using PowerDesigner and implemented them on Development and Test databases and moved the same to Production Instances during the production release.
  • Documented using the Metadata Repository (MDR) and the business rule spreadsheet (BRS) from the Corporate Data Warehouse (CDW) and the Integrated Operational Data Store (IODS) for financially reported Critical Data Elements (CDEs).
  • Extensive experience with Business Intelligence (BI) tools and technologies such as OLAP and data warehousing.
  • Experienced in developing a customized mortgage account management solution providing home mortgages, home refinancing, and home equity loans to consumers.
  • Expert in working with Data Modeling tools including ERwin Data Modeler and Sybase PowerDesigner.
  • Expertise in Logical Modeling, Physical Modeling, Dimensional Modeling, and Star and Snowflake schemas.
  • Developed Data Migration and Cleansing rules for Integration Architecture (OLTP, ODS, DW)
  • Extensive success in translating Business Requirements and user expectations into detailed specifications employing Unified Modeling Language (UML).
  • Strong business analysis skills and an understanding of the software development life cycle (SDLC) utilizing Rational Unified Process (RUP).
  • Experience in designing/developing SQL statements and queries for Oracle and SQL Server 2005/2008/2008 R2/2012 databases for Data profiling and development.
  • Experience in writing/optimizing SQL queries in Oracle and SQL Server 2005/2008.
  • Skilled at conducting Requirement Analysis, use case design, test plans design and database schemas Development based on the logical models.
  • Expertise in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) with good knowledge on SQL Server Analysis Services (SSAS).
  • Experience in maintenance and administration of SSIS by creating Jobs, Alerts, and SQL Mail Agent, and scheduling DTS/SSIS packages.
  • Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of reports like Drill Down, Drill Through, Parameterized, Cascading, Conditional, Table, Matrix, Chart and Sub Reports.
  • Experience in developing, monitoring, extracting and transforming data using DTS/SSIS, Import Export Wizard, and Bulk Insert.
  • Experience in SQL Server DTS and SSIS (Integration Services) package design, construction, and deployment.
  • Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), Data Transformation Services (DTS) and SQL Server Integration Services (SSIS)
  • Experience in creating packages to transfer data between ORACLE, MS ACCESS and FLAT FILES to SQL SERVER using DTS/SSIS.
  • Hands on experience in Creating Star Schema and Snow Flake Schemas
  • Experienced in design/development of Data Warehouse applications using ETL and Business Intelligence tools like Informatica PowerCenter (9.5.1/8.1/7.x/6.x/5.x)/PowerMart 4.7, DataStage, Mainframes, SAS, SSAS, SSIS, OLTP, OLAP and Business Objects.
  • Facilitated/participated in Joint Application Development (JAD) sessions, and white board sessions with SMEs to keep executive staff and team members apprised of goals, project status, and issue resolution.
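The flat-file-to-database transfers mentioned above (DTS/SSIS, BCP, Bulk Insert) can be sketched in a few lines. This illustrative example uses Python's csv module and an in-memory SQLite database as a stand-in for SQL Server; the table and column names are invented for the sketch, not taken from any actual project.

```python
import csv
import io
import sqlite3

# Hypothetical flat file; a real load would read this from disk.
flat_file = io.StringIO("account_id,balance\n1001,250.00\n1002,75.50\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, balance REAL)")

# Parse the flat file, then bulk-insert all rows in one executemany call.
rows = [(int(r["account_id"]), float(r["balance"])) for r in csv.DictReader(flat_file)]
conn.executemany("INSERT INTO accounts VALUES (?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0])  # 2
```

In a production SSIS or BCP load, the same staging pattern applies: parse the source file once, then push rows to the target table in batches rather than one INSERT per row.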

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2003/2000/NT, LINUX, UNIX.

Languages: SQL, PL/SQL, T-SQL, HTML

Databases/software: Oracle 10g/9i/8i, SQL Server 2012/2008 R2/2008/2005/2000, DB2, UDB, MySQL, MS Access, MS Office, SharePoint, MS Excel

Rational Tools: Rational Rose, Rational RequisitePro, Rational ClearQuest, Rational ClearCase and SoDA

ETL Tools: Ab Initio, Informatica 8.1/7.x/6.x/5.x, DataStage, SAS, SSIS, SSAS

Data modeling tools: Sybase PowerDesigner, CA Erwin, MS Visio

Reporting tools: SSRS, Crystal Reports, and Business Objects

Methodologies: UML, RUP, Ralph Kimball’s Dimensional Data Mart modeling

PROFESSIONAL EXPERIENCE:

Confidential, Boston, MA

Data Analyst

Responsibilities:

  • Involved in projects from the requirement-analysis phase onward to better understand requirements and support the development team's understanding of data related to mortgages, origination, liens, and defaults.
  • Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their format.
  • Worked with OLAP, ETL, data warehousing and data modeling tools.
  • Worked as a developer creating stored procedures, triggers, functions, tables, and other SQL joins and statements for applications.
  • Created reports that retrieve data using database code objects such as stored procedures, views, functions and multiple T-SQL queries.
  • Report Design and Coding for Standard Tabular type reports, including Drill down and Drill through functionality and Graphical presentations such as Charts and Dashboard type metrics.
  • Developed data access queries/programs using SQL to run in production on a regular basis.
  • Created report snapshots to improve the performance of SSRS.
  • Created report models for ad hoc reports when the end user wants to see the reports on the fly.
  • Responsible for Monitoring and Tuning Report performance by analyzing the execution plan of the report.
  • Worked extensively in creating the DMR Mapping Spreadsheets, Source to Target Mapping documents.
  • Analyzed the transformation rules and Joins/Filter logic documents.
  • Worked as a member of the Data lineage Team in the Enterprise Data Program division to create the Data Functional Design Documents.
  • Reverse Engineered databases to understand the relationship between already existing tables.
  • Created an initial conceptual data model to incorporate the new data elements for new functionality.
  • Created Logical/Physical data models from the conceptual model identifying the cardinality between the tables.
  • Prepared the Joins/Filter logic documents that would help the ETL design team perform the Joins on the tables that are in the form of flat files before loading them to FDS or any downstream systems.
  • Generated DDL scripts for the developed physical model.
  • Created Source to target Mapping Matrix in loading the data to Dimensions and Fact Tables.
  • Worked with data modelers to handle the modeling changes to legacy and future databases, and implementing them on Development and test databases and moving the same to Production Instances during the production release.
  • Created use case scenarios to test performance of the Finance Data Store.
  • Worked with the CCG (Consumer and Credit group) team to develop a star schema for the CCG - Collections Data Mart identifying the Fact and Dimension tables.
  • Involved in Semantic layer design. Building views based on requirements given by users.
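A star schema like the CCG Collections Data Mart described above pairs a central fact table with surrounding dimension tables. The following minimal sketch uses SQLite purely for illustration; the table and column names (dim_account, fact_collections) are hypothetical, not taken from the actual data mart.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key.
cur.execute("""CREATE TABLE dim_account (
    account_key INTEGER PRIMARY KEY,
    account_type TEXT)""")

# Fact table: measures plus foreign keys into the dimensions.
cur.execute("""CREATE TABLE fact_collections (
    account_key INTEGER REFERENCES dim_account(account_key),
    amount_collected REAL)""")

cur.executemany("INSERT INTO dim_account VALUES (?, ?)",
                [(1, "mortgage"), (2, "home_equity")])
cur.executemany("INSERT INTO fact_collections VALUES (?, ?)",
                [(1, 500.0), (1, 250.0), (2, 100.0)])

# Typical star-schema query: join fact to dimension, then aggregate.
total = cur.execute("""
    SELECT d.account_type, SUM(f.amount_collected)
    FROM fact_collections f JOIN dim_account d USING (account_key)
    GROUP BY d.account_type ORDER BY d.account_type""").fetchall()
print(total)  # [('home_equity', 100.0), ('mortgage', 750.0)]
```

The design choice that makes this a star rather than a snowflake is that each dimension is a single denormalized table joined directly to the fact table.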

Environment: Oracle, SQL Server 2008 R2/2012, Windows 2008, OLAP, VB.NET, ETL, Microsoft Access, Microsoft Visio, Microsoft Excel, Remedy.

Confidential, Baltimore, MD

Data Analyst

Responsibilities:

  • Interacted with the business community and database administrators to identify the business requirements and data realities.
  • Responsible for dimensional modeling of the data warehouse to design the business process.
  • Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
  • Developed a number of Informatica Mappings, Mapplets and Transformations to load data from relational and flat file sources into the data warehouse.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked with various transformations such as Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator, and Joiner on the extracted source data according to the business rules and technical specifications.
  • Worked with the team to load data from IMS files into Oracle tables using Informatica, performing any required data cleansing.
  • Worked with SQL tools like TOAD to run SQL queries and validate the data.
  • Worked on database connections, SQL joins, and views at the database level.
  • Worked with team members using SQL*Loader to load data from flat files into Oracle database tables.
  • Optimized the mappings using various optimization techniques and debugged existing mappings using the Debugger to test and fix them.
  • Implemented procedures/functions in PL/SQL for Stored Procedure Transformations.
  • Monitored workflows and collected performance data to maximize the session performance.
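The Update Strategy transformation listed above decides per incoming row whether the target should receive an insert or an update (Informatica's DD_INSERT/DD_UPDATE flags). The same idea can be sketched outside Informatica as an upsert; this sketch uses SQLite, and the dim_customer table and its data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, city TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Baltimore')")

# Incoming rows: one existing key (becomes an update) and one new key
# (becomes an insert), mirroring DD_UPDATE / DD_INSERT row routing.
incoming = [(1, "Boston"), (2, "Towson")]
for customer_id, city in incoming:
    conn.execute("""INSERT INTO dim_customer VALUES (?, ?)
                    ON CONFLICT(customer_id) DO UPDATE SET city = excluded.city""",
                 (customer_id, city))

rows = conn.execute("SELECT * FROM dim_customer ORDER BY customer_id").fetchall()
print(rows)  # [(1, 'Boston'), (2, 'Towson')]
```

Letting the database resolve the insert-vs-update decision (ON CONFLICT here, MERGE in Oracle/SQL Server) avoids a separate lookup round trip per row.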

Environment: Informatica 7.1, MS SQL Server 2008, SSIS, SSAS, Oracle 10g, ERwin 7, Windows 2003, IBM DB2, PL/SQL, SQL.

Confidential, Chula Vista, CA

Senior Data Analyst

Responsibilities:

  • Involved in extraction, transformation and loading of data directly from different source systems (flat files/Excel/Oracle/MS SQL/Teradata) using SAS/SQL and SAS/Macros.
  • Read various forms of input files, such as CSV and other formatted files, using INFILE and PROC IMPORT, and developed documented SAS data-cleaning operations.
  • Created and modified several database objects such as tables, views, indexes, constraints, stored procedures, packages, functions and triggers using SQL and PL/SQL.
  • Created large datasets by combining individual datasets using various inner and outer joins in SAS/SQL and dataset sorting and merging techniques using SAS/Base.
  • Wrote Python scripts to parse XML documents and load the data into the database.
  • Queried data from Hadoop/Hive and MySQL data sources to build visualizations in Tableau.
  • Validated regulatory finance data and created automated adjustments using advanced SAS Macros, PROC SQL, UNIX and various reporting procedures.
  • Performed statistical data analysis and generated ad-hoc reports, tables, listings and graphs using tools such as SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL and SAS/STAT.
  • Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop. Involved in creating Tree Maps, Heat Maps and background maps.
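The parse-XML-and-load task mentioned above can be done with Python's standard library alone. This sketch uses xml.etree.ElementTree and an in-memory SQLite database as a stand-in for the real target; the claims payload and table name are hypothetical examples, not actual project data.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML payload; a real script would read files from disk.
xml_doc = """<claims>
  <claim id="C1"><amount>120.50</amount></claim>
  <claim id="C2"><amount>75.00</amount></claim>
</claims>"""

# Extract one (id, amount) tuple per <claim> element.
root = ET.fromstring(xml_doc)
records = [(c.get("id"), float(c.findtext("amount"))) for c in root.iter("claim")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)", records)

print(conn.execute("SELECT SUM(amount) FROM claims").fetchone()[0])  # 195.5
```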

Environment: Windows 8, SAS v9.4, SAS/EG, Python, SAS/Base, SAS/Access, SAS/Macro, SAS/SQL, SAS/Graph, SAS/STAT, SAS/Connect, MS SQL Server, Oracle, Teradata, MS-Excel, Tableau 8.2.

Confidential

Data Analyst

Responsibilities:

  • Identify business, functional and Technical requirements through meetings and interviews and JAD sessions.
  • Defined the ETL mapping specifications and designed the ETL process to extract data from sources and load it into Data Warehouse tables.
  • Designed the logical and physical schema for data marts and integrated the legacy system data into data marts.
  • Integrated DataStage metadata into Informatica metadata and created ETL mappings and workflows.
  • Designed mappings and identified and resolved performance bottlenecks in Source-to-Target mappings.
  • Developed Mappings using Source Qualifier, Expression, Filter, Look up, Update Strategy, Sorter, Joiner, Normalizer and Router transformations.
  • Involved in writing, testing, and implementing triggers, stored procedures and functions at database level using PL/SQL.
  • Developed Stored Procedures to test ETL Load per batch and provided performance optimized solution to eliminate duplicate records.
  • Involved with the team on ETL design and development best practices.
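Eliminating duplicate records per batch, as described above, usually reduces to keeping one row per business key. A minimal sketch of that logic, using SQLite and an invented staging table (the real work was done in stored procedures against SQL Server):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (batch_id INTEGER, record_key TEXT, loaded_at TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)", [
    (1, "A", "2023-01-01"),
    (1, "A", "2023-01-02"),   # duplicate business key: keep the latest load
    (1, "B", "2023-01-01"),
])

# Keep one row per record_key (the most recent), discarding duplicates.
deduped = conn.execute("""
    SELECT record_key, MAX(loaded_at)
    FROM staging
    GROUP BY record_key
    ORDER BY record_key""").fetchall()
print(deduped)  # [('A', '2023-01-02'), ('B', '2023-01-01')]
```

Doing the deduplication in one set-based statement, rather than row-by-row procedural checks, is typically what makes the stored-procedure version performance-optimized.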

Environment: Informatica 7.1, Power Exchange, IBM Rational Data Architect, MS SQL Server 2005, PL/SQL, IBM Control Center, TOAD, MS Project Plan
