Data Modeler/ Data Analyst Resume
Chicago
CAREER SUMMARY:
- Over 3 years of IT experience as a Data Analyst/Data Modeler in domains such as Insurance and Retail. Proven ability working with enterprise-level large Data Warehouses, RDBMS systems, and ETL and modeling tools/techniques.
- Skilled in systems analysis, ER and Dimensional Modeling, database design, and implementing RDBMS-specific features.
- Experience modeling both OLTP and OLAP systems in Kimball and Inmon Data Warehousing environments.
- Good experience with Normalization (1NF, 2NF, and 3NF) and Denormalization techniques for improved database performance in OLTP, OLAP, Data Warehouse, and Data Mart environments.
- Well-versed in designing Star and Snowflake schemas for relational and dimensional data modeling.
- Expertise in Extract, Transform, and Load (ETL) of data from spreadsheets, database tables, and other sources using Microsoft Data Transformation Services (DTS) and Informatica.
- Good working knowledge of metadata management: consolidating metadata from disparate tools and sources, including the data warehouse, ETL, relational databases, and third-party metadata, into a single repository to report on data usage and support end-to-end change impact analysis.
- Experienced in writing complex SQL queries to perform end-to-end ETL validations and support ad-hoc business requests; also skilled in developing Stored Procedures, Triggers, Functions, and Packages using SQL/PL-SQL.
- Strong understanding of data warehousing concepts: Star Schema, Snowflake Schema, Fact and Dimension tables; good knowledge of Dimensional Modeling and Normalization approaches.
- Proficient with multiple RDBMS platforms, including their architecture: Oracle, Teradata, and SQL Server.
- Some experience writing stored procedures and user-defined functions (UDFs).
- Experienced with Teradata utilities (BTEQ, FastLoad, MultiLoad, FastExport, and TPump) and with writing MultiLoad, FastLoad, FastExport, and BTEQ scripts.
- Proficient in Data Quality Management techniques such as Data Profiling, Data Cleansing, Data Integrity, Data Mining, Reference Data management, and Data Security.
- Experience with Tableau for data analysis and the creation of dashboards and stories.
- Experienced in extracting data from flat files and converting them into SAS datasets using SAS PROC IMPORT, PROC SQL, etc.
- Strong knowledge of Software Development Life Cycle (SDLC) including Waterfall and Agile models.
- Experience working with multiple/cross-functional groups, conducting brainstorming/root cause analysis sessions.
- Strong computing skills including Microsoft Office suite: Excel, PowerPoint, Access, Word, Microsoft Outlook.
- Proficiency in prioritizing and multitasking to ensure that assignments are completed on time.
- Demonstrated willingness, interest, and aptitude for learning new technologies and acquiring new skills. Excellent communication and interpersonal skills, strengthened by daily interaction with BAs, DAs, stakeholders, and company management in defining and clarifying project and business needs.
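A minimal sketch of the kind of end-to-end ETL validation query described in the summary above: reconciling row counts and a column total between a staging (source) table and its warehouse (target) table. All table and column names here are hypothetical, invented purely for illustration.

```python
import sqlite3

# Hypothetical staging and warehouse tables; names and data are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_policy (policy_id INTEGER, premium REAL);
    CREATE TABLE dw_policy  (policy_id INTEGER, premium REAL);
    INSERT INTO stg_policy VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO dw_policy  VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

# One query pulls counts and totals from both sides in a single pass,
# so source and target can be compared row-for-row and dollar-for-dollar.
src_cnt, tgt_cnt, src_sum, tgt_sum = cur.execute("""
    SELECT s.cnt, t.cnt, s.total, t.total
    FROM (SELECT COUNT(*) AS cnt, SUM(premium) AS total FROM stg_policy) s,
         (SELECT COUNT(*) AS cnt, SUM(premium) AS total FROM dw_policy) t
""").fetchone()
print(src_cnt == tgt_cnt and src_sum == tgt_sum)  # True when the load reconciles
```

In practice the same pattern extends to per-column checksums and minus/except queries, but the count-and-total reconciliation above is the core of most ETL validations.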
TECHNICAL SKILLS:
Requirements Management Tools: Rational ClearCase, Doors
Analysis/Modeling Tools and Methodologies: UML, JAD, RUP, Waterfall, Agile
Defect Tracking and Testing Tools: Quality Center, JIRA, HP ALM, Service Now
Operating Systems/Platforms: Windows 7/XP/2000/2003/NT, RHEL 5x/6x
Office Applications and other tools: MS Office Suite, Visio, SharePoint, MS Project
Databases: Teradata 13, MS SQL Server 2008 R2, Oracle 11g
DB Tools: SQL Plus, SQL Developer, Teradata SQL Assistant, Toad for Oracle and DB2, MongoDB, SSMS, CA ERwin DM, Rational Rose, ER/Studio, MS Visio, SAP PowerDesigner, Embarcadero, Hadoop, Podium
Programming languages: C, C++, Java, SQL, T-SQL, PL/SQL, SAS, Perl, Shell Scripting, JavaScript, Python
Reporting Tools: Crystal Reports XI, Business Intelligence, SSRS, Business Objects 5.x/6.x, Cognos 7.0/6.0, Tableau, Informatica PowerCenter 7.1/6.2, Ab Initio, DataStage, Unica Campaign, Tivoli Workload Scheduler, Control-M
Web Technologies: HTML, DHTML, XML, CSS, JavaScript, AngularJS
PROFESSIONAL EXPERIENCE:
Confidential, Chicago
Data Modeler/ Data Analyst
Technologies: Oracle 11g, Python, Hadoop
Responsibilities:
- Support and maintain the Federal Employee Program - Care Coordination Technology Infrastructure.
- Involved in the full SDLC, including gathering business requirements, architecture, design documentation, planning, development, deployment, and maintenance of the Care Coordination Technology Infrastructure.
- Gathered and analyzed business and technical requirements; created the Functional Specification Document.
- Attended program design, coding, and test walkthrough meetings to provide input on adherence to technical standards and customer requirements.
- Conducted Functional Requirement Specification artifact meetings with the project teams and analyzed the risks involved.
- Responsible for data mapping, metadata maintenance and enhancing existing Logical and Physical data models.
- Conducted sessions with architects, business SME and development teams to understand the Change Requests (CR’s), and effects caused on the system by the change requests.
- Determined data rules and conducted Logical and Physical design reviews with business analysts, development team and DBAs.
- Experience with Big Data technologies such as Hadoop (HDFS) and the databases it supports, such as Hive, MongoDB, and other NoSQL stores.
- Maintained and updated the Metadata repository based on change requests.
- Analyzed existing Logical and Physical data models and altered them using Erwin to support enhancements.
- Conducted walkthroughs with the DBA to convey the changes made to the data models.
- Implemented SQL Scripts to modify the data to resolve assigned defects.
- Worked with the ETL team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
- Developed source to target mapping documents to support ETL design.
- Created tables, views, sequences, indexes, and constraints, and generated SQL scripts for physical implementation.
- Reverse-engineered database schemas into ERwin Physical Data Models for reusability, then forward-engineered them to Teradata using ERwin.
- Resolved the data inconsistencies between the source systems and the target system using the Mapping Documents.
- Designed the data sourcing, data staging and ETL process. Mapped the data between the source and targets.
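The physical-implementation work listed above (tables, indexes, constraints, star-schema design) can be sketched as a small DDL slice: one dimension, one fact, and a foreign-key constraint tying them together. Everything here is illustrative; none of the names come from the actual project.

```python
import sqlite3

# A tiny, hypothetical star-schema slice: one dimension and one fact table.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
conn.executescript("""
    CREATE TABLE dim_member (
        member_key INTEGER PRIMARY KEY,   -- surrogate key
        member_id  TEXT NOT NULL UNIQUE   -- natural key from the source system
    );
    CREATE TABLE fact_claim (
        claim_key  INTEGER PRIMARY KEY,
        member_key INTEGER NOT NULL REFERENCES dim_member(member_key),
        paid_amt   REAL NOT NULL CHECK (paid_amt >= 0)
    );
    CREATE INDEX ix_fact_claim_member ON fact_claim(member_key);
""")
conn.execute("INSERT INTO dim_member VALUES (1, 'M-1001')")
conn.execute("INSERT INTO fact_claim VALUES (1, 1, 125.0)")

# The FK constraint rejects facts that reference a missing dimension row,
# which is exactly the inconsistency the mapping documents are meant to catch.
try:
    conn.execute("INSERT INTO fact_claim VALUES (2, 99, 10.0)")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True
print(fk_enforced)  # True
```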
Confidential, New York, NY
Data Modeler/ SQL Developer
Technologies: Oracle 11g, Erwin, SQL Server
Responsibilities:
- Interacted with users and business analysts to gather requirements.
- Reviewed the existing data model and documented suspect design areas affecting system performance.
- Initiated and conducted JAD sessions inviting various teams to finalize the required data fields and their formats.
- Developed Logical and Physical Data Models using ERwin r7.0.
- Created the logical data model from the conceptual model and converted it into the physical database design.
- Extensively used Star Schema methodology in designing the logical data model into Dimensional Models.
- Applied Second and Third Normal Form (2NF, 3NF) to normalize the existing ER data model of the OLTP system.
- Used Denormalization techniques in the DSS to create summary tables and improve complex join operations.
- Gathered statistics on large tables and redesigned Indexes.
- Installed and configured SQL Server.
- Designed and developed the databases.
- Wrote Stored Procedures and Triggers extensively, working closely with developers, business analysts, and end users to generate various audit reports and troubleshoot their query and connectivity problems.
- Created integrity rules and defaults.
- Analyzed resource-intensive queries and changed existing back-end code (stored procedures, triggers) to speed up report execution.
- Created FTP connections and database connections for the sources and targets.
- Maintained security and data integrity of the database.
- Developed several forms & reports using Crystal Reports.
- Provided maintenance support to customized reports developed in Crystal Reports/ASP.
- Analyzed live-site issues related to the database and resolved them.
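The denormalization-for-DSS technique noted above (summary tables replacing repeated aggregate queries) can be sketched as follows; the schema and data are invented for the example.

```python
import sqlite3

# Hypothetical detail table plus a precomputed summary table that the load
# job would refresh, so reports avoid re-aggregating the large detail table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO orders VALUES (1,'East',10.0),(2,'East',20.0),(3,'West',5.0);
    CREATE TABLE sum_orders_by_region AS
        SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total
        FROM orders GROUP BY region;
""")

# A report can read the small summary table instead of scanning the detail.
detail = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'East'").fetchone()[0]
summary = conn.execute(
    "SELECT total FROM sum_orders_by_region WHERE region = 'East'").fetchone()[0]
print(detail == summary)  # True: the summary table stands in for the aggregate
```

The trade-off is standard: faster reads for the DSS reports in exchange for keeping the summary in sync at load time.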
Confidential
Jr. Data Analyst
Technologies: SQL Server 2008 R2, Teradata 11, TWS, ESP
Responsibilities:
- Involved in gathering, analyzing and documenting business and technical requirements. Analyzed project requirements to discover gaps, provide technical and functional solutions.
- Worked closely with BAs to gather and provide specific and crucial information through data mining using advanced SQL queries, OLAP functions, etc.
- Discovered Data Lineage and read application material to understand workflows and complexity. Mined data from numerous source systems.
- Collected SQL queries and re-wrote them as necessary for pattern recognition and generation for events.
- Conducted workshops for user acceptance testing, training, and support. Created Pivot tables in Microsoft Excel for further analysis.
- Developed user test cases and validated test results during user acceptance testing (UAT).
- Created detailed documentation to explain the results achieved after performing validation.
- Performed data cleansing by analyzing and eliminating duplicate and inaccurate data.
- Proficient working on Linux/UNIX platforms, with good knowledge of SQL databases.
- Exceptional troubleshooting and problem-solving skills in TWS end-to-end environments, including incident and problem management with responsibility for root cause analysis and resolution.
- Part of the U2L (Unix to Linux) Migration Project; successfully completed the migration of both applications and served as a key resource in configuring the new TWS masters and DB2 server.
- Handled day-to-day troubleshooting and maintenance requests.
- Implemented and supported a highly available and scalable TWS domain manager on Linux in a TWS End-to-End configuration, supporting Fault Tolerant Agents (FTAs) on Windows and Linux and z-Centric Agents across Windows and Linux platforms.
- Developed Shell and Perl scripts to automate routine tasks.
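The data-cleansing step described above (eliminating duplicate and inaccurate records from an extract) can be sketched minimally as below; the sample file contents and column names are invented for illustration.

```python
import csv
import io

# A hypothetical flat-file extract with one duplicate row, one missing value,
# and one non-numeric value; the cleansing pass drops all three.
raw = """member_id,age
M1,34
M1,34
M2,
M3,abc
M4,52
"""
seen, clean = set(), []
for row in csv.DictReader(io.StringIO(raw)):
    key = (row["member_id"], row["age"])
    if key in seen:               # exact duplicate record
        continue
    if not row["age"].isdigit():  # missing or non-numeric age
        continue
    seen.add(key)
    clean.append(row)
print([r["member_id"] for r in clean])  # ['M1', 'M4']
```

Real cleansing jobs add per-column profiling rules, but duplicate elimination plus type/range validation, as above, is the usual first pass.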