Sr. Data Analyst Resume
Cincinnati, OH
SUMMARY:
- Over 7 years of techno-functional experience across the Manufacturing, Banking & Finance, and Healthcare domains, working in complex Big Data business environments with data warehousing/business intelligence methodologies, tools, and technologies.
- Skilled in Solution Architecture, Enterprise Data Architecture, Data Quality, Master Data Management, Business Systems Analysis, Data Governance Frameworks, Metadata, Business Intelligence, Data Warehousing, Data Flow Analysis, Process Flow Analysis and Mapping, Data Profiling, Development using ETL and BI Tools and Technologies, Development of Business Metrics and KPIs, Data Quality Metrics, Source-to-Target Mapping, and Source System Analysis, and in suggesting solutions to complex big data and system problems.
- Strong experience working with various source systems and systems of record.
- Experience with project management.
- Proficient with ITIL, Agile and Waterfall methodologies for software solutions development.
- Six Sigma Rigor and Deployment in Software (DMADV, DMAIC).
- Strong analytical ability and presentation and communication skills.
- Data Integration experience, including EDI, ETL, Data Warehouse, and Data Conversion.
- Data Acquisition, Collation, Information Dissemination and Access Strategy & development.
- Data warehousing/BI-OLAP/ETL technologies (SAP, Cognos®, Business Objects®, Informatica®): development, deployment, testing, and support.
- Excellent technical and analytical skills, with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.
- Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
- Experience collecting business requirements from customers and transforming them into database processes and data schemas.
- Leading and directing user meetings to scope, gather, collate, and manage requirements for analytical as well as regulatory reporting.
- Strong experience in preparing Business Requirements Document (BRD), and Functional Specifications.
- Strong experience in end-user interaction at all levels (mid-senior, senior, and corporate) for analyzing business requirements and creating technical and functional specifications.
- Asserting enterprise-wide best practices in systems integration, BI, data warehousing, and Big Data management.
- Experience in architecting and managing AWS cloud infrastructure; involved in developing and implementing web applications using Python.
- Developed views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
- Leading and directing BI Data Warehousing Tools and technology teams for testing, development and post production support.
- Worked through the API development and deployment life cycle, from initial Swagger documentation to final production release and production support.
- Used DAX (Data Analysis Expressions) functions to create calculations and measures in Tabular Models.
- Experience in Korn shell scripting.
- Created customized, parameterized dynamic reports and dashboards using SSRS and MDX, linking reports and passing selected parameter values to sub-reports; made these accessible to users in SharePoint 2013.
- Experience in the development and maintenance of SQL, PL/SQL, stored procedures, functions, constraints, indexes, and triggers.
- Experience in working with huge data sets.
- Worked extensively on the core and Spark SQL modules of Spark.
- Performed system design specification to clarify, refine, and revise system requirements.
- Source System Analysis and ETL strategy.
- Strong hands-on experience using Teradata utilities (SQL Assistant, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman) and UNIX shell (Korn) scripting.
- Experience with SQL programming.
- Implemented a domain-driven approach in Java to make project modules independent so they can easily be plugged in or out for further enhancements or changes.
- Experience in designing, coding, testing, and supporting projects, and in developing a middleware hub using Java.
- Well versed in writing Korn and Bash UNIX shell scripts.
- Wrote Python scripts to parse XML documents and load the data into a database (see the sketch at the end of this summary).
- Migrated existing Infrastructure of Guidewire products into cloud using Amazon AWS and its associated Services.
- Exposure to dimensional and relational data modeling: star-schema modeling and snowflake modeling.
- Data movement, staging, cleansing and conforming strategy.
- Enterprise-wide data governance, data management, information management, and analytics delivery.
- Data quality and Master Data Management.
- Business Analysis, Dataflow analysis, BI Business Analysis, Data Profiling, Information Flow analysis, System analysis, Source Systems Analysis, Data Analysis, Feasibility Analysis.
- Creating Source-to-Target Mapping.
- Hands-on experience with quality processes, quality assurance, CMM, the Software Development Life Cycle (SDLC), and Agile methodologies.
- UAT and testing strategy.
- Used different types of files, including flat, CSV, and Excel files.
- Experience working with XML file structures.
- Experience in process definition and process redefinition.
- Extensively worked on reports that contained several Static and Dynamic prompts.
- Used Tableau for creating dashboards and generating reports.
- Built SharePoint team and project sites.
- Involved in planning, preparation, migration, and clean-up.
- SQL, RDBMS, MDDB (Oracle, Teradata, MS SQL Server).
- Design, Development, Testing and Deployment of BI Artifacts (ETL Jobs and Reports)
- Software Project Management and Delivery with Global user base and teams (offshore-onsite model).
- Good presentation, verbal, and written communication skills.
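The bullet on XML parsing above refers to a common load pattern; the following is a minimal sketch of it, assuming a flat XML layout and a SQLite target. The element names (record, id, name, amount), the table, and the file paths are hypothetical placeholders, not details from an actual engagement.

```python
# Minimal sketch: parse an XML document and load rows into a database.
# Element names, the table, and paths are hypothetical placeholders.
import sqlite3
import xml.etree.ElementTree as ET

def load_xml_to_db(xml_path: str, db_path: str) -> int:
    tree = ET.parse(xml_path)
    rows = [
        (rec.findtext("id"), rec.findtext("name"), rec.findtext("amount"))
        for rec in tree.getroot().iter("record")
    ]
    conn = sqlite3.connect(db_path)
    with conn:  # commits on success, rolls back on error
        conn.execute(
            "CREATE TABLE IF NOT EXISTS records (id TEXT, name TEXT, amount TEXT)"
        )
        conn.executemany("INSERT INTO records VALUES (?, ?, ?)", rows)
    conn.close()
    return len(rows)
```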
PROFESSIONAL EXPERIENCE:
Confidential, Cincinnati, OH
Sr. Data Analyst
Responsibilities:
- Worked closely with the project team to gather business requirements and interacted with business users to translate them into technical specifications.
- Analyzed and reviewed the gathered requirements in the Business Requirement Document (BRD) and Functional Specification Document (FSD).
- Collected, managed, and updated the business requirements document (BRD).
- Documented and transformed business rules and logic into functional requirements, designing business models using Unified Modeling Language (UML) diagrams and use-case scenarios.
- Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
- Actively managed and continuously improved enterprise, operational, SOX, market, model-governance, and IT risk for the Investment & Capital Markets division: models, trading desks, and back office.
- Created data flow diagrams, data mapping from Source to stage and Stage to Target mapping documents indicating the source tables, columns, data types, transformations required and business rules to be applied.
- Created Test cases and Test scripts for System Integration Testing (SIT), User Acceptance Testing (UAT) and Unit Integration Testing.
- Performed extensive Oracle database programming and tuning of SQL queries.
- Developed batch processes for financial reporting applications and modules using Perl and Korn shell scripts against an Oracle database with partitions and sub-partitions.
- Queried databases using SQL and Perl scripts.
- Used heat maps to identify risk levels across different internal departments.
- Designed Informatica mappings such as pass-through, split, Type 1, and Type 2, and used various transformations such as Aggregator, Rank, Mapplets, connected and unconnected Lookups, SQL overrides, etc.
- Implemented CDC (change data capture) to insert into and update slowly changing dimension tables in the target, maintaining history data (see the first sketch after this list).
- Imported data from a Kafka consumer group into Apache Spark through Spark Streaming APIs (see the second sketch after this list).
- Supported annual budgeting and forecasting, and conducted monthly analysis of key business drivers to support management and business-partner metrics.
- Analyzed contributors to cost reductions and increases based on data provided by CM.
- Validated ETL mappings and tuned them for better performance, implementing various performance-tuning techniques.
- Validated job flows and dependencies used by the Tidal scheduler to run Informatica workflows, shell scripts, and stored procedures.
- Implemented a domain-driven approach in Java to make project modules independent so they can easily be plugged in or out for further enhancements or changes.
- Extracted large volumes of data feed from different data sources, performed transformations and loaded the data into various Targets.
- Used Python-based GUI components for front-end functionality such as selection criteria.
- Worked on the Teradata loading utilities MultiLoad, FastLoad, and FastExport.
- Performed Unit Testing at various stages by checking the data manually.
- Performed Data Analysis and Data validation by writing SQL queries.
- Participated in the Incident Management and Problem Management processes for root cause analysis, resolution and reporting.
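The CDC bullet above describes Type 2 slowly-changing-dimension maintenance; here is a simplified illustration in plain Python with SQL rather than as an Informatica mapping. The dim_customer table, its columns, and the tracked attribute are hypothetical.

```python
# Type 2 SCD sketch: when a tracked attribute changes, expire the current
# dimension row and insert a new current version. Table and column names
# are hypothetical; the production logic lived in Informatica mappings.
import sqlite3
from datetime import date

def apply_scd2_change(conn: sqlite3.Connection, customer_id, new_address):
    today = date.today().isoformat()
    row = conn.execute(
        "SELECT address FROM dim_customer WHERE customer_id = ? AND current_flag = 1",
        (customer_id,),
    ).fetchone()
    if row and row[0] == new_address:
        return  # no change detected; nothing to do
    with conn:
        # Close out the current version of the row, if one exists.
        conn.execute(
            "UPDATE dim_customer SET current_flag = 0, end_date = ? "
            "WHERE customer_id = ? AND current_flag = 1",
            (today, customer_id),
        )
        # Insert the new version as the current row.
        conn.execute(
            "INSERT INTO dim_customer "
            "(customer_id, address, start_date, end_date, current_flag) "
            "VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_address, today),
        )
```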
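The Kafka bullet above is sketched below with PySpark's Structured Streaming Kafka source (the work itself used the Spark Streaming APIs); the broker address and topic name are placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
# Sketch: read a Kafka topic into Spark and echo parsed messages.
# Broker and topic are placeholders; requires the spark-sql-kafka package.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings before processing.
messages = stream.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

query = messages.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```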
Environment: Rational Unified Process (RUP), Tableau, Rational Suite (RequisitePro, ClearQuest, ClearCase), MS Visio, MS Excel, MS Project, Oracle 9i, DB2, Mainframe, COBOL, SharePoint, MS Power Suite, Informatica, Mercury TestDirector, MS SQL Server 2005, Rational Rose, Visual Basic, Brio.
Confidential, Philadelphia, PA
Reporting Data Analyst
Responsibilities:
- Design conceptual data models by analyzing the data requirements needed to support the business processes.
- Create data mining structures and mining models, using either relational data sources or multidimensional data in cubes.
- Create a data mining structure and an initial related mining model, including selecting an algorithm type and a data source and defining the case data used for analysis.
- Perform logical data modeling (star schema and snowflake schema) using Oracle BI tools.
- Transform the logical data model into a physical data model, working with DBAs to create the data warehouse.
- Extract data from various sources; clean and transform the data and load it into the Oracle 11g reporting database via SQL Developer, TOAD, or a SQL client worksheet GUI.
- Perform data analysis on large volumes of data to identify duplications, data anomalies, missing data, etc.
- Ensure the successful daily load of raw data to the server.
- Work on database design: entity-relationship modeling, dimensional modeling, and star and snowflake schemas.
- Write SQL queries effectively and efficiently, including inner/outer joins, inserts, and table creation.
- Write SQL queries to run ad-hoc reports as requested by management.
- Create reports in Excel, including pivot tables, complex calculations, and conditional formatting (see the sketch after this list).
- Perform data analysis and data modeling to support the business users' needs.
- Backup databases and test the integrity of the backups.
- Develop dashboards and analytics reports on OBIEE BI platform.
- Set up the reporting environment, develop strategies for data collection and automation, model the data environment, and implement the solution.
- Effectively interacted with executive level clients.
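The Excel pivot-table bullet above is sketched here with pandas rather than directly in Excel; the column names and output file are hypothetical.

```python
# Sketch: pivot-table summarization of report data with pandas.
# Column names and the output file are hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [120.0, 135.5, 98.0, 110.25],
})

# Rows = region, columns = quarter, values = summed revenue.
pivot = sales.pivot_table(
    index="region", columns="quarter", values="revenue", aggfunc="sum"
)
pivot.to_excel("revenue_summary.xlsx")  # requires openpyxl
```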
Confidential
Reporting/Business Analyst
Responsibilities:
- Managed MCI's Arlington center marketing MS SQL Server 6.5/7.0 databases used by marketing staff for employee reporting purposes.
- Used MS SQL Server 6.5/7.0, NT Workstation, and NT Server to maintain local databases, process checks, and verify data.
- Assessed existing and available data warehousing technologies and methods to ensure the data warehouse/BI architecture met the needs of the business unit and enterprise and allowed for business growth; provided recommendations on the evolution of the architecture.
- Exported/imported raw data from the production database to the marketing center's database.
- Ensured the successful daily load of raw data to the server.
- Created ad-hoc reports using SQL and Access as requested by management.
- Built an analytics reports and dashboard framework.
- Developed customized reports at the request of the MCI business management team; these reports were used to determine productivity and commissions for marketing staff, with daily, cyclical, and monthly outputs.
- Effectively developed SQL queries to produce reports on customer service goals, contests, and recognition and to adjust employee hierarchies; in addition, supported functions such as creating tables, inserting data, updating data, and producing ad-hoc reports.
- Assisted the call center’s staff with all aspects of reporting and database issues
- Backed up databases and tested the integrity of the backups.
- Maintained database tables with hierarchy and goal data/information
- Prepared unit test cases and a production migration strategy.
- Provided post-production support and enhancement work.
- Troubleshot and updated Remedy tickets for issues related to the MCI system, reporting, sales adjustments, hierarchy, goals, the control desk, and other ad-hoc customer service issues.
- Used Excel to create spreadsheets for submitting bonuses to over 500 employees, ensuring the process was completed accurately and within deadlines.
- Used Excel to export data and CSV files to insert data into the center's database (see the sketch after this list).
- Created Excel formulas and macros to set center goals and analyze historical goals.
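The CSV-to-database bullet above follows a pattern like the sketch below, with SQLite standing in for the center database; the file layout, table, and columns are hypothetical.

```python
# Sketch: read rows exported from Excel as CSV and insert them into a
# database table. File layout, table, and columns are hypothetical.
import csv
import sqlite3

def load_goals_csv(csv_path: str, db_path: str) -> int:
    with open(csv_path, newline="") as f:
        rows = [(r["employee_id"], r["goal"], r["month"])
                for r in csv.DictReader(f)]
    conn = sqlite3.connect(db_path)
    with conn:
        conn.executemany(
            "INSERT INTO center_goals (employee_id, goal, month) VALUES (?, ?, ?)",
            rows,
        )
    conn.close()
    return len(rows)
```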
Confidential, New York City
Data Analyst
Responsibilities:
- Gathered business requirements through interviews, surveys, prototyping, and observation of account managers and the UI (user interface) of the existing Broker Portal system.
- Involved in testing data mapping and conversion in a server based data warehouse.
- Conducted JAD sessions with management, SME (Subject Matter Expert), vendors, users and other stakeholders for open and pending issues to develop specifications.
- Defined the calculations based on the reporting needs and the source data available.
- Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
- Organized Joint Application Development (JAD) sessions with data stewards and ETL teams, holding walkthroughs of mapping documents; created a team-specific Agile process flow in JIRA to move tasks from one activity to another.
- Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
- Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.
- Performed data modeling to correct problems with the current Sybase database. In addition, optimized current stored procedures and triggers to expedite the loading process.
- Reviewed database tables and performed data mapping for a large-scale data conversion project from an Oracle database to the mainframe.
- Created Data Dictionary and ER diagrams for data mapping purposes.
- Developed data mapping documents between Legacy, Production, and User Interface Systems.
- Used data transformation tools such as DTS, SSIS, Informatica, and DataStage.
- Developed business requirement specification documents as well as high-level project plan.
- Loaded staging tables on Teradata and then target tables on SQL Server via DTS transformation packages.
- Downloaded summary data from SQL queries into Excel for further summarization and reporting, using Excel lookups, pivot tables, graphs, and formatting appropriate for the particular internal consumer (see the sketch after this list).
- Identified and documented the data sources and transformation rules required to populate and maintain data warehouse content.
- Performed data migration (ETL development) and documented data manipulation processes and scripts.
- Conducted User Acceptance Testing on the application - resolved issues from the participants, prepared and submitted Test Analysis Reports and participated in Product Readiness Review.
- Prepared the documentation of Data Acquisition and Interface System Design.
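The SQL-to-Excel bullet above is sketched here with pandas against a SQLite stand-in; the query, table, and file names are hypothetical.

```python
# Sketch: pull summary data from a SQL query and hand it off as an Excel
# workbook for internal consumers. Query, table, and names are hypothetical.
import sqlite3
import pandas as pd

conn = sqlite3.connect("warehouse.db")  # stand-in for the reporting database
summary = pd.read_sql_query(
    "SELECT product, SUM(units) AS total_units "
    "FROM sales GROUP BY product ORDER BY total_units DESC",
    conn,
)
conn.close()
summary.to_excel("product_summary.xlsx", sheet_name="Summary", index=False)
```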
Environment: Windows NT 4.0, Oracle 9i, ETL, Teradata, Perl, SQL Server 2005, Oracle Designer, PL/SQL, XSD, SSIS, DOS and UNIX shell scripting, sequential files.
Confidential - Pittsburgh, PA
Data Analyst
Responsibilities:
- Involved as a developer for the commercial business group data warehouse.
- Developed source data profiling and analysis; reviewed data content and metadata to facilitate data mapping and validate assumptions made in the business requirements.
- Created logical and physical designs of the database and ER Diagrams for Relational and Dimensional databases using Erwin.
- Extracted data from Oracle relational databases and flat files.
- Developed complex transformations, Mapplets using Informatica Power Center 8.6.1 to Extract, Transform and load data into Operational Data Store (ODS).
- Led, created, and launched new automated testing tools and accelerators for SOA services and data-driven automation built within our practice.
- Designed complex mappings using Source Qualifier, Joiner, Lookup (connected and unconnected), Expression, Filter, Router, Aggregator, Sorter, Update Strategy, Stored Procedure, and Normalizer transformations.
- Ensured data consistency by cross-checking sampled data upon migration between database environments (see the sketch after this list).
- Developed a process to extract the source data and load it into flat files after cleansing, transforming, and integrating it.
- Designed SSIS packages to extract, transform, and load (ETL) existing data into SQL Server from different environments for the SSAS cubes.
- Extensively worked on developing and deploying applications using SQL Server 2005, Microsoft Office, VBA, and macros.
- Worked with architecture and modeling teams and used middleware SOA services.
- Performed data alignment and data cleansing, and used the Debugger to test mappings and fix bugs.
- Created sessions, both sequential and concurrent, for proper execution of mappings in Workflow Manager.
- Provided SSRS and SSIS support for internal IT projects requiring report development.
- Involved in System Integration Testing (SIT) and User Acceptance Testing (UAT).
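The consistency-check bullet above can be pictured as the sketch below: compare row counts, then spot-check a sample of rows by key between the source and target connections. It assumes sqlite3-style connections (execute available on the connection, qmark parameters); the table and key names are hypothetical.

```python
# Sketch: post-migration consistency check via row counts plus a keyed
# row sample. Assumes sqlite3-style connections; names are hypothetical.
def check_consistency(src_conn, tgt_conn, table="orders", key="order_id",
                      sample=100) -> bool:
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if src_count != tgt_count:
        return False
    # Spot-check a sample of rows; assumes the key is the first column.
    cur = src_conn.execute(
        f"SELECT * FROM {table} ORDER BY {key} LIMIT ?", (sample,)
    )
    for src_row in cur:
        tgt_row = tgt_conn.execute(
            f"SELECT * FROM {table} WHERE {key} = ?", (src_row[0],)
        ).fetchone()
        if tgt_row != tuple(src_row):
            return False
    return True
```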
Environment: Informatica 8.6.1, Oracle, SQL Server 2005, RDBMS, FastLoad, FTP, SFTP.