Data Analyst Resume

Irving, Texas

SUMMARY

  • Over 10 years of IT experience in Business & Data Analysis, ETL Development, ETL Testing, and Data Modeling.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Mining, Data Migration, Data Integration, Data Architecture, Metadata Management Services, and Configuration Management.
  • Linked data lineage to data quality and business glossary work within the overall data governance program.
  • Implemented Data Governance using Excel and Collibra.
  • Experience implementing Collibra to automate data management processes.
  • Performed an end-to-end Data Lineage assessment and documentation for select CDEs (Critical Data Elements).
  • Gathered requirements by working with the business users on Business Glossary, Data Dictionary, Reference data, and Policy Management.
  • Participated in the Data Governance working group sessions to create Data Governance Policies.
  • Experience automating ETL processes for dynamic data migration and validation using Informatica, SSIS, and Alteryx.
  • Led and delivered a Master Data Governance, Data Quality (DQ), D&B enrichment, and cleansing solution for a large French-based global life-sciences company.
  • Extensive experience in Relational and Dimensional Data Modeling, creating Logical and Physical database designs and ER Diagrams with all related entities and relationships based on the rules provided by the business manager, using ERWIN.
  • Experienced in performance tuning of Informatica objects: finding the bottlenecks at the source, target, and mapping levels and eliminating them with tuning methods.
  • Strong expertise in using the ETL tool Informatica PowerCenter 8.6/9 (Designer, Workflow Manager, Repository Manager, Data Quality (IDQ)) and ETL concepts.
  • Extensive experience creating visualization dashboards using Tableau and Alteryx.
  • Very large database optimization and performance tuning to address data latency requirements, utilizing performance analysis, execution plans, Bitmap Indexing, Partitioning, Materialized Views, Parallel Queries, Merge, and Pipelines (a minimal SQL sketch follows this list).
  • Enterprise wide applications security and compliance, corporate standards and best practices.
  • Collaboration software (SharePoint), technical specifications and system documentation.
  • Good knowledge of ETL processes such as source, mapping, transformation, and staging areas; created various ETL documents such as ETL Mapping documents, Data Mapping documents, ETL test scripts, etc.
  • Excellent knowledge of the Software Development Life Cycle (SDLC); adhered to standards/guidelines throughout the development life cycle and maintenance of the models.
  • Possess strong documentation and knowledge-sharing skills; conducted data modeling sessions for different user groups, facilitated common data models between different applications, and participated in requirement sessions to identify logical entities.
  • Extensive Experience working with business users/SMEs as well as senior management.
  • Strong understanding of the principles of Data Warehousing using Fact Tables, Dimension Tables, and star and snowflake schema modeling.
  • Strong experience with Database performance tuning and optimization, query optimization, index tuning, caching and buffer tuning.
  • Extensive experience in backend programming including Database table design, Stored procedures, Triggers, Indexes, and Performance tuning in DB2/Netezza/Teradata and Oracle.
  • Good knowledge in analysis and reporting using various Business Intelligence tools such as Cognos and Business Objects.
  • Excellent verbal, presentation and written communication skills. Able to communicate with technical, non-technical and senior management audiences. Have played a significant role in requirements planning, client interaction and documenting.
  • Excellent problem-solving skills with a strong technical background; results-oriented team player with excellent communication and interpersonal skills.
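
For illustration, a minimal Oracle-flavored sketch of the VLDB tuning techniques referenced above (partitioning, bitmap indexing, materialized views); every object and column name here is hypothetical, not taken from an actual engagement:

    -- Range-partitioned fact table to prune scans by date (names are illustrative).
    CREATE TABLE sales_fact (
        sale_id    NUMBER,
        sale_date  DATE,
        region_id  NUMBER,
        amount     NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date) (
        PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01'),
        PARTITION p2017 VALUES LESS THAN (DATE '2018-01-01'),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    -- Bitmap indexes suit low-cardinality columns; on a partitioned table
    -- Oracle requires them to be LOCAL.
    CREATE BITMAP INDEX sales_fact_region_bix ON sales_fact (region_id) LOCAL;

    -- Pre-aggregated materialized view, eligible for query rewrite.
    CREATE MATERIALIZED VIEW sales_by_region_mv
    BUILD IMMEDIATE REFRESH COMPLETE ON DEMAND ENABLE QUERY REWRITE AS
    SELECT region_id, TRUNC(sale_date, 'MM') AS sale_month, SUM(amount) AS total_amount
    FROM sales_fact
    GROUP BY region_id, TRUNC(sale_date, 'MM');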

TECHNICAL SKILLS

Data Modeling: Erwin (8.2, 8.1, 7.3.5), PowerDesigner, Embarcadero

Databases/Tools: Teradata 12.0, Netezza, Oracle 10g/11g, MS SQL Server 2014, MySQL, Sybase 11.5/11.2, MS Access, COBOL Copybook, Mainframe Files, DB2 UDB, XML, SQL Navigator, Toad

ETL Tools: Alteryx, Informatica PowerCenter 8.x/7.x/6.1/5.1, DataStage, Ab Initio

BI Tools: Tableau, SAP BW, Business Objects, Cognos, OBIEE, Crystal Reports 2008

Architectures: OLAP, OLTP, OOP, SOA, Client-Server

Databases: Teradata, SQL Server, Oracle, MongoDB

Operating Systems: UNIX, Linux 7.0+, Macintosh 9+, Windows 95/98/2000/XP

Programming Languages: C, C++, Java, XML, HTML, SQL, JavaScript, Perl, PHP, Shell Scripting, Web Services

Software Engineering Tool: Rational Rose, Requisite Pro

PROFESSIONAL EXPERIENCE

Confidential, Irving, Texas

Data Analyst

Responsibilities:

  • Worked extensively on various projects implementing database changes in the respective Data Models, performing Reverse Engineering of database structures to enhance the models, and comparing the models to incorporate changes.
  • Created 20 workflows using Alteryx pulling data from relational and non-relational databases.
  • Designed and developed ETL workflows and datasets in Alteryx.
  • Processed data in Alteryx to create TDE and Hyper files for Tableau reporting.
  • Created analytical applications in Alteryx Designer and stored them on Alteryx Server for non-technical users.
  • Created in-database Alteryx workflows and data preparation for visualization in Tableau & Power BI.
  • Created ETL between different data warehouses such as Snowflake & Redshift via Alteryx workflows.
  • Extensively used Erwin and Normalization Techniques to design Logical/Physical Data Models, relational database design.
  • Worked with Informatica Cloud to create Source /Target connections, monitor, and synchronize the data with salesforce.
  • Worked with Informatica cloud for creating source and target objects, developed source to target mappings.
  • Worked on Informatica performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Developed ETL programs using Informatica Power center 9.6.1/9.5.1 to implement the business requirements.
  • Developed Data Mapping, Data Governance, Transformation, and cleansing rules for the Master Data Management Architecture.
  • Built a Qlik Sense dashboard to present current data quality and the effectiveness of data cleansing.
  • Participated in Requirement Gathering, Business Analysis, and User meetings, discussing requirements for new table designs in the existing database and new columns for existing tables.
  • Worked on the Data Warehouse team analyzing data files that had to be compiled from disparate non-production sources and readying them for production. Tasks included comparing data to requirements documentation and creating data layouts, the data dictionary, and the pipe-delimited and fixed-width files. In addition, served as team lead for managing the completion and loading of these files for the Data Management group.
  • Conducted data modeling sessions for different user groups, facilitated common data models between different applications, participated in requirement sessions to identify logical entities.
  • Well-versed in understanding existing logical data models and constructing physical data models from same.
  • Designed, Developed, Performed Testing and Implementation of ETL processes using Informatica Cloud
  • Developed ETL / Master Data Management (MDM) processes using the Informatica Cloud platform.
  • Dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud ISD.
  • Worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Sorter, Router, Sequence Generator, XML transformation, etc., using Informatica Cloud.
  • Performed Informatica Cloud workflow integration, system, and regression testing and reported the test results to the users with the Traceability Matrix (a sample reconciliation query follows this list).
  • Ability to synchronize updates to physical data models with updates from the data architecture logical data models.
  • Ability to follow IBC configuration management standards (e.g. check-in and check-out procedures for models as defined by IBC standards).
  • Coordinated with the DBA in implementing database changes and updated Data Models with changes implemented in Development, QA, and Production.
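
A hedged sketch of the kind of source-to-target reconciliation query run during the regression testing noted above; the schema and table names (stg.customer_txn, dw.customer_txn_fact) are hypothetical:

    -- Compare row counts and an amount checksum between staging and target.
    SELECT 'STG' AS layer, COUNT(*) AS row_cnt, SUM(amount) AS amount_sum
    FROM stg.customer_txn
    UNION ALL
    SELECT 'TGT', COUNT(*), SUM(amount)
    FROM dw.customer_txn_fact;

    -- List keys loaded into staging but missing from the target.
    SELECT s.txn_id
    FROM stg.customer_txn s
    LEFT JOIN dw.customer_txn_fact t ON t.txn_id = s.txn_id
    WHERE t.txn_id IS NULL;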

Environment: Alteryx, Teradata, Hive, Hadoop, Impala, SAP BW, MongoDB, Netezza, Oracle 9i/10g/11g, SQL Server 2008/2012/2014, DB2 LUW, DB2 z/OS, Unix AIX 5.2, Linux, ERWIN 7.3/8.2

Confidential, Chicago, IL

Data Analyst

Responsibilities:

  • Gathered requirements from business users to prepare an Interface Specification Document to be provided to various LOBs for sourcing both transactional and reference data.
  • Involved in gathering requirements from the Business Users in creating an effective data model to capture all the required data attributes.
  • Data Modeling for different domains in the staging layer.
  • Involved in performing a gap analysis on the existing data model of the staging layer versus the requirement on Data Management Portal.
  • Provided pseudo code to the ETL development team for capturing exceptions as part of the DQ process (a simplified SQL version appears after this list).
  • Prepared the S-T (source-to-target) mapping document incorporating different business transformation logic for implementation of the ETL process.
  • Involved in the exception handling process by creating workflow to escalate it to the respective LOBs.
  • Worked closely with the Project Manager in creating the process swim lane.
  • Involved in a periodic reconciliation of data between the various source systems and the end target system, and created a BO report on the reconciliation breaks to be displayed in the DM Portal.
  • Provided SQL queries to extract data from the Legacy System on alerted accounts and customer information to feed into the Portfolio Analyzer system.
  • Analyzed feed metrics data produced by the ETL process to arrive at efficient SLA times to provide to the Operations team.
  • Involved in creating a detailed user manual on the DM portal applications to be provided to the users.
  • Performed data profiling using the Informatica Analyst tool and generated scorecards.
  • Involved in writing use cases for UAT testing and writing validation scripts using SQL.
  • Creating Data Model for Feed Management.
  • Worked on creating a Data Model for Compliance Regulatory Library and helped implement the SDLC process.
  • Brainstormed possible BLT exceptions that could arise as part of the ETL implementation.
  • Performed Data Analysis and Data validation by writing complex SQL queries using TOAD against the ORACLE database.
  • Worked on an automation process to maintain Reference Data from various systems and refresh the data periodically.
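
A simplified SQL rendering of the DQ exception-capture pseudo code mentioned above; the rules, table names, and rule codes are hypothetical placeholders:

    -- Route rows that fail basic DQ rules into an exception table so the
    -- owning LOB can be alerted through the escalation workflow.
    INSERT INTO dq_exception_log (src_system, record_id, rule_code, load_dt)
    SELECT 'LEGACY', t.record_id, 'MISSING_CUST_ID', CURRENT_DATE
    FROM stg_transactions t
    WHERE t.customer_id IS NULL
    UNION ALL
    SELECT 'LEGACY', t.record_id, 'FUTURE_TRADE_DATE', CURRENT_DATE
    FROM stg_transactions t
    WHERE t.trade_date > CURRENT_DATE;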

Confidential, Wilmington, DE

Data modeler/Analyst

Responsibilities:

  • Extensively worked on various database impacts on the CMS application once the impacted changes were known from the project requirements review.
  • Extensively worked on a model maintenance project, bringing all Data Models that were not up to standards into a standardized form.
  • Wrote implementation plans for implementing database changes into the QA and Production environments.
  • Coordinated with the DBA in implementing database changes and updated Data Models with changes implemented in Development, QA, and Production.
  • Created Data dictionary using ERWIN modeler.
  • Involved in low-level design for the scripts of the database sequences, constraints, triggers, and stored procedures (see the sketch after this list).
  • Knowledge of merging several flat files into one XML file.
  • Extensively worked on creating, altering, and deleting tables in different Development environments as well as Production.
  • Configured and worked with a multi-node Hadoop cluster; installed Cloudera, Apache Hadoop, Hive, Pig, and Spark; handled commissioning and decommissioning of DataNodes, NameNode recovery, and capacity planning.
  • Installed and configured Apache Hadoop to test the maintenance of log files in the Hadoop cluster.
  • Continuously monitored and managed the Hadoop cluster through Cloudera Manager.
  • Performed architecture design, data modeling, and implementation of SQL, Big Data platform, and analytic applications for the consumer products.
  • Managed SQL and Hadoop clusters: adding and removing cluster nodes, cluster monitoring and troubleshooting, managing and reviewing data backups, managing and reviewing Hadoop log files, and performing manual failover to check cluster functioning.
  • Wrote Spark code using Scala and Spark-SQL for faster testing and data processing.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas, data dictionary, publishing the model to PDF and HTML format, generating various data modeling reports, etc.
  • Extensively worked with logical models, converting them into complete physical models ready to be implemented in the database.
  • Designed a Conceptual Data Model, Logical Data Model, and Physical Data Model using Erwin.
  • Extensively worked with developers on analyzing the impacted changes to the respective applications, based on which the design approach was chosen, and implemented the same changes in the database with the help of the DBA.
  • Created physical models from logical models for the applications that didn't have any physical models.
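
As a sketch of the low-level design work called out above, an Oracle-style sequence, table, and trigger script; the object names are invented for illustration:

    -- Surrogate-key sequence for the hypothetical cms_case table.
    CREATE SEQUENCE cms_case_seq START WITH 1 INCREMENT BY 1;

    CREATE TABLE cms_case (
        case_id    NUMBER        PRIMARY KEY,
        case_desc  VARCHAR2(200) NOT NULL,
        created_dt DATE          DEFAULT SYSDATE
    );

    -- Before-insert trigger that assigns the key when none is supplied.
    CREATE OR REPLACE TRIGGER cms_case_bir
    BEFORE INSERT ON cms_case
    FOR EACH ROW
    WHEN (NEW.case_id IS NULL)
    BEGIN
        SELECT cms_case_seq.NEXTVAL INTO :NEW.case_id FROM dual;
    END;
    /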

Environment: Oracle 9i/10g/11g, SQL Server 2005/2008/2008 R2/2012/2014/2016, Hadoop, Toad 8.6 for Oracle, DB2, Unix AIX 5.2, Linux, ERWIN 7.3, Flat Files, Hyperion 8.3

Confidential, Eden Prairie, MN

Data Analyst

Responsibilities:

  • Participated in Requirement Gathering, Business Analysis, User meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.
  • Worked on the Data Warehouse team analyzing data files that had to be compiled from disparate non-production sources and readying them for production. Tasks included comparing data to requirements documentation and creating data layouts, the data dictionary, and the pipe-delimited and fixed-width files. In addition, served as team lead for managing the completion and loading of these files for the Data Management group.
  • Developed loading scripts for the Teradata database using the FastLoad utility.
  • Participated as member of a project team performing various lifecycle tasks for an intranet application with Oracle database for tracking public offering orders. Developed logical and physical database models and data dictionary documentation. Worked as liaison between users and developers refining and explaining user requirements.
  • Worked with the Teradata FastLoad utility to develop loading scripts to mock the data.
  • Assessed data quality requirements in terms of data completeness, consistency, conformity, accuracy, referential integrity, and duplication; evaluated vendors (Informatica Data Explorer/Data Quality); identified and measured data quality; and designed and implemented a data profiling and data quality improvement solution to analyze, match, cleanse, and consolidate data before loading into the data warehouse (sample profiling queries follow this list).
  • Developed data quality reports/dashboards with Cognos Framework Manager and Report Studio.
  • Designed the relational data model for operational data store and staging areas, Designed Dimension & Fact tables for data marts.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas, data dictionary, publishing the model to PDF and HTML format, generating various data modeling reports, etc.
  • Designed a Conceptual Data Model, Logical Data Model, and Physical Data Model using Erwin.
  • Worked as a Data Warehouse and Database Developer. Core responsibility was to interview customers and subject matter experts to gather requirements for a proposed database, analyze these requirements, and identify the source systems required to build the data warehouse using Data Warehouse Builder.
  • Worked as a Data Architect in a centralized group doing logical and physical data models in ERWIN; also wrote database interface specifications and documented them in the Data Manager data dictionary.
  • Designed Informatica ETL Mappings documents, created ETL staging area framework, created data mapping documents, data flow diagram, ETL test scripts etc.
  • Extensively used Erwin and Normalization Techniques to design Logical/Physical Data Models, relational database design.
  • Experience with utilities Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE).
  • Conducted data modeling sessions for different user groups, facilitated common data models between different applications, participated in requirement sessions to identify logical entities.
  • Extensively involved in creating, maintaining, and updating documents such as requirement documents, database design documents, system design documents, naming standard procedures, SOPs, etc.
  • Analyzed and optimized the existing business processes using Conceptual Models and Data Flow Diagrams. The process included performance enhancement requirements assessment and mapping of existing workflows and data flows.
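
Sample profiling queries of the sort behind the completeness and duplication checks above; stg_customer and its columns are hypothetical:

    -- Completeness profile: total rows, null rate on an attribute,
    -- and distinct count on the natural key.
    SELECT COUNT(*)                                        AS total_rows,
           SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           COUNT(DISTINCT customer_id)                     AS distinct_customers
    FROM stg_customer;

    -- Duplicate detection on the natural key before the warehouse load.
    SELECT customer_id, COUNT(*) AS dup_cnt
    FROM stg_customer
    GROUP BY customer_id
    HAVING COUNT(*) > 1;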

Environment: Erwin, DB2 v7, XML/XSD, Qwest Center, MS SQL Server, Teradata, RedHat Linux Shell Scripts, PL/SQL, IBM Control Center, Informatica PowerCenter v8.6.1, Cognos v8, Tunnelier, Test Track Pro, TOAD, PVCS, Microsoft Project Plan
