
Senior Business Intelligence Developer Resume

SUMMARY

  • 8 years of experience architecting and implementing enterprise, business, and OLAP/OLTP solutions in the Healthcare, Insurance, and Banking domains.
  • Experienced in design and development in the field of Data Warehousing, data management, and Business Intelligence, having performed multiple roles including Dev Lead, ETL Designer/Developer, and Quality Analyst.
  • Experience in development and deployment of DTS and SSIS packages from MS Access and Excel, usage of multi-environment sources such as Oracle 10g/9i, SQL Server, Teradata, DB2, mainframe, SFTP, and Sybase, and creation of UNIX shell scripts.
  • Management and implementation of database models, data flow diagrams, database schemas, DB scripts, and DTD schemas to support robust data management.
  • Experienced in programming tasks: Stored Procedures, Triggers, Indexed Views, User-Defined Functions, and Cursors using SQL Server 2008/2012 with T-SQL.
  • Extensive knowledge of RDBMS components such as cursors and their use in creating automated stored procedures and generating triggers.
  • Operational knowledge of Informix IDS v10, Ab Initio, Informix XPS, and Data Integrator.
  • 7 years of extensive experience working with SQL Server 2008 R2/2014 and Parallel Data Warehouse.
  • 7 years of extensive experience in ETL design and development using SSIS/Informatica/Ab Initio, designing/modeling using SSAS, and creating reports with SSRS and Tableau.
  • Involved in migration of SSIS packages to 2012 CU5 and troubleshooting migration issues.
  • Experience working with SharePoint, SharePoint Designer, Visual Studio, SQL Reporting Services, ASP.NET, C#, C++, Adobe Photoshop, PowerBuilder, Oracle, XML, Microsoft SQL Server, Microsoft Excel (creating reports, VBA, macros, ODBC), Access (running queries), PowerPoint (creating basic decks), and Crystal Reports.
  • Experienced in the SDLC Design, Development, and Staging phases of projects, supported by Data-Flow diagrams, Process Models, and E-R Diagrams.
  • Experience tracking defects and testing unit test cases with HP Quality Center.
  • Also proficient in Installation, Configuration and Updates of SQL Server and Deployment of SSRS using Scale-Out, Scale-Up and Local topologies.
  • Designing and deployment of reports for end-user requests using a Web Interface & SSRS.
  • Experience in writing functional requirements, Design documents, ETL specifications, Unit Test Cases and System Test cases.
  • Experience in writing Quality Center scripts for Unit testing, Functional testing, IQ (Installation Qualification).
  • Involved in large data migrations and transfers using utilities like Data Transformation Services (DTS), SSIS, Bulk Copy Program (BCP), and Bulk Insert.
  • Strong interpersonal skills; a skilled problem solver and efficient team player with excellent communication.
  • Proven technical leadership to manage global delivery with large teams in an onshore-offshore model.
  • Experience handling teams across the US and India on a multivendor platform.
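The automated stored-procedure and trigger work mentioned above can be illustrated in miniature. The following is a hypothetical sketch (table and column names are illustrative, not from any actual project) using Python's built-in sqlite3, which supports the same trigger pattern as T-SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A claims table and an audit table that a trigger keeps in sync,
# mirroring the "automated stored procedures and triggers" pattern.
cur.executescript("""
CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE claims_audit (claim_id INTEGER, amount REAL,
                           changed_at TEXT DEFAULT CURRENT_TIMESTAMP);
CREATE TRIGGER trg_claims_audit AFTER INSERT ON claims
BEGIN
    INSERT INTO claims_audit (claim_id, amount)
    VALUES (NEW.claim_id, NEW.amount);
END;
""")

cur.execute("INSERT INTO claims (amount) VALUES (125.50)")
conn.commit()
# The trigger has recorded the insert without application code.
audit_rows = cur.execute("SELECT claim_id, amount FROM claims_audit").fetchall()
```

In SQL Server the same idea would be an `AFTER INSERT` trigger writing to an audit table; the sketch just shows the mechanism end to end.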

TECHNICAL SKILLS

ETL Tools: MSBI (SSIS, SSAS, SSRS), Informatica PowerCenter 9.5, DataStage, Ab Initio (3.16)

Databases: SQL Server 2008 R2, MySQL, Oracle 11g, Teradata 16.0, Sybase, DB2

Data modeling Tools: Microsoft Visual Basic, C++, Microsoft Visio 2003, SSAS (OLAP), MS Excel, SharePoint, Hive, Pig, Hadoop Streaming, MapReduce

Languages: PL/SQL, T-SQL, I-SQL, C#, Unix/Perl Scripting, XML

Reporting/Visualization Tools: SSRS, Tableau, Business Objects

Methodologies: SDLC, Agile

OS: Windows server 2003, UNIX, Windows NT/XP/2000/Vista/7

Other Tools: DWH concepts (dimensional modeling, star schema, snowflake schema)
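The star-schema concept listed above can be sketched with one fact table joined to a dimension. A minimal hypothetical example (illustrative names only) using Python's built-in sqlite3:

```python
import sqlite3

# A minimal star schema: one fact table keyed to a date dimension.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (date_key INTEGER REFERENCES dim_date(date_key),
                         revenue REAL);
""")
cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, 2015), (2, 2016)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                [(1, 100.0), (1, 50.0), (2, 75.0)])

# A typical star-schema rollup: revenue by year via the dimension join.
by_year = cur.execute("""
    SELECT d.year, SUM(f.revenue) FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year ORDER BY d.year""").fetchall()
```

A snowflake schema would further normalize the dimension (e.g. splitting year out into its own table); the query pattern stays the same with one extra join.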

PROFESSIONAL EXPERIENCE

Confidential

Senior Business Intelligence Developer

Responsibilities:

  • Designing and implementing enterprise & business segment BI solutions.
  • Defining, designing, developing, and optimizing ETL processes for the enterprise data warehouse, with 24x7 support.
  • Working with Cerner data; designing and analyzing data via Tableau.
  • Performing extensive gap analysis in the project to deal with as-is and to-be conditions.
  • Building, publishing customized interactive reports, dashboards, views, report scheduling/subscribing via Tableau.
  • Created action filters, parameters, calculated fields, sets, and grouped fields for Tableau dashboards.
  • Extensive T-SQL programming and creation of data marts as data repository for creation of data models and ETL Development.
  • Being in a healthcare organization, restricted data to particular users using row-level security.
  • Developing dashboards/workbooks from multiple data sources using data blending.
  • Trained around 230 users on Tableau and provide 24x7 support for Tableau end users.
  • Developed Tableau financial/revenue dashboards to enable yearly, quarterly, and MTD analysis.
  • Also created dashboards for HR data, presenting it interactively and enabling users to view the underlying data using crosstabs.
  • Work environment primarily comprises PDW, SSIS, SSAS, SSMS, and Windows.
  • Deal with around 400 TB of data and handle a team of 4.
  • Manage and develop SSIS packages and Tableau reports of clinical, financial, HR, ER, and Transplant data.
  • Working on PDW control nodes and Landing Zone creation to migrate tables from SQL Server to PDW using BCP through DWLoader scripts.
  • Maintaining EHR (Electronic Health Records) data and Cerner.
  • Efficient in writing MDX queries and creating dashboards, charts, and pivot tables/charts.
  • Create data models using SSAS out of the healthcare Cerner data deposited in EDW.
  • Mainly Encounters, Diagnosis, and Procedure facts with dimensions such as DimDate, DimICD9, and DimLocation.
  • Proficient in usage of loops (while, for and cursors)
  • Trained in Big Data and Hadoop; gradually transitioning into it.
  • Installed and configured Apache Hadoop, Hive and Pig environment on the prototype server.
  • Configured SQL database to store Hive metadata.
  • Good hands-on experience in Hive SQL.
  • Loaded unstructured data into Hadoop File System (HDFS).
  • Created reports and dashboards using structured and unstructured data.
  • Also have a strong hold on Python scripting.
  • Generated scripts for OLAP database backup and developed a scheduler using SQL server agent job.
  • Migrating existing SSRS reports to Tableau and providing training to users on Tableau usage.
  • Created dashboard reports and workbooks for various healthcare divisions such as Finance, Transplant, OR, and ED/ER in Tableau.
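The row-level security mentioned above (restricting healthcare data to particular users) can be approximated with a user-to-department mapping table and a filtered query. A hypothetical miniature in Python's built-in sqlite3 (all names illustrative; real PHI controls would of course be enforced in the database engine):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE encounters (encounter_id INTEGER, department TEXT);
CREATE TABLE user_department (user_name TEXT, department TEXT);
""")
cur.executemany("INSERT INTO encounters VALUES (?, ?)",
                [(1, "Transplant"), (2, "ER"), (3, "Transplant")])
cur.executemany("INSERT INTO user_department VALUES (?, ?)",
                [("alice", "Transplant"), ("bob", "ER")])

def visible_encounters(user):
    # Only rows in departments mapped to this user are returned,
    # emulating row-level security via the mapping-table join.
    return cur.execute("""
        SELECT e.encounter_id FROM encounters e
        JOIN user_department u ON u.department = e.department
        WHERE u.user_name = ?
        ORDER BY e.encounter_id""", (user,)).fetchall()
```

In SQL Server the equivalent would be a security policy with a filter predicate function; the mapping-table join is the same idea expressed portably.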

Tools: Visual Studio 2008/2015, SSIS, SSRS/Tableau, SSAS, PDW, UNIX, Windows & PowerShell scripts.

Confidential

Onsite Lead Coordinator

Responsibilities:

  • Worked as onsite lead coordinator for a team of 20, acting as a liaison between the offshore team and the client.
  • Efficiently adhered to and implemented the SDLC (System Development Life Cycle).
  • Involved in technical decisions for business requirements; interacted with Business Analysts, the client team, and the development team; handled capacity planning and upgrades of system configuration.
  • Created Mapping Specification document as per requirements.
  • Implementation and delivery of Informatica platform solutions to develop and deploy ETL, analytical, reporting and scorecard / dashboards on Sybase/Teradata using SSAS, Datastage, Business Objects.
  • Developed, deployed, and monitored DataStage jobs for new ETL processes and upgraded existing jobs for ongoing ETL processes.
  • Used DataStage to load data from Sybase to Teradata as part of the migration project.
  • Monitored Full/Incremental/Daily loads and supported all scheduled ETL jobs for batch processing.
  • Interacted with different system groups for analysis of systems.
  • Involved in Installation, Configuration and Deployment of Reports using Tableau
  • Developed automated stored procedures for calculating line of credit and risk in US and Non-US markets.
  • Operated in Teradata configuration, administration, implementation, and troubleshooting for business work.
  • Complete end-to-end involvement in the migration of the database from Sybase to Teradata.
  • Completely involved in monitoring and carrying out daily code drops and archives on the production databases during maintenance windows.
  • Developed automated scripts for migration of data from Sybase to Teradata and vice versa through UNIX.
  • Performed regular database maintenance, checking disk defrag procedures in different environments.
  • Performed database refresh tasks from Production to Development and Staging servers.
  • Performed database consistency checks with DBCC, defrag, and index tuning, and monitored error logs.
  • Tuned database performance through index tuning using Database Engine Tuning Advisor to resolve performance issues.
  • Developed and optimized database structures, stored procedures, Dynamic Management views, DDL triggers and user-defined functions.
  • Involved in writing Stored Procedures and Functions, Cursors to handle database automation tasks
  • Created database artifacts in Teradata, according to the requirements.
  • Used Teradata utilities like MultiLoad, TPump, and FastLoad to load data into the Teradata data warehouse from Sybase, and created a tool for data movement between the development, test, and production Teradata environments.
  • Resolved various defects in QC.
  • Played a major role in setting up different environments - Testing, Integration, Performance and Production environment
  • Assisted in batch processes using FastLoad, BTEQ, UNIX (Korn shell scripting), and Teradata SQL to transfer, clean up, and summarize data.
  • Created Informatica workflows to balance load time keeping in consideration the sequence of load as per business requirements.
  • Performance Tuning of sources, Targets, mappings and SQL queries in transformations.
  • Developed unit test plans and involved in system testing.
  • Wrote Build plan with steps to implement the code in multiple environments.
  • Performed functional and unit testing on tables loaded in Teradata; prepared test result reports to ensure quality.
  • Prepared process documents required throughout the SDLC, such as the system requirement detail specification, ETL specification, test specification, performance document, capacity plan, traceability matrix, and Teradata estimate documents.
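The FastLoad/BTEQ-style batch pattern described above (bulk-stage raw rows, then clean up and summarize into a target table) can be sketched generically. A hypothetical miniature in Python's built-in sqlite3, not the actual pipeline; table names and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_txn (acct TEXT, amount REAL);
CREATE TABLE txn_summary (acct TEXT PRIMARY KEY, total REAL);
""")

# Bulk-load the staging table (analogous to a FastLoad/BCP step).
raw = [("A1", 10.0), ("A1", 5.0), ("A2", 7.5), ("A1", None)]
cur.executemany("INSERT INTO stg_txn VALUES (?, ?)", raw)

# Clean up (drop NULL amounts) and summarize into the target table,
# the step a BTEQ script would perform after the load.
cur.execute("""
    INSERT INTO txn_summary
    SELECT acct, SUM(amount) FROM stg_txn
    WHERE amount IS NOT NULL GROUP BY acct""")
conn.commit()

summary = dict(cur.execute("SELECT acct, total FROM txn_summary"))
```

Keeping the raw load and the cleanup/summarize step separate is what lets the bulk utility run without transformation logic, and lets a failed summarize be rerun from staging.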

Confidential

Onsite Lead Coordinator

Responsibilities:

  • Involved in creation/review of functional requirement specifications and supporting documents for business systems; experienced in the database design and data modelling processes.
  • Leading the effort to create the Data Quality check (DQ on Hops) process (DQC and Event Engine nodes), with automated runs through an internal scheduler (event-engine), for Card (Global Corporate Payments and OPEN) and Non-Card portfolios; this Amex internal tool validates the quality of data coming from the source as an additional check before loading target tables.
  • Leading the code release effort from offshore, covering development, testing, and package formation for production implementation.
  • Performed gap analysis on corporate tables for production defects and data mismatches between Sybase IQA5, TD Amex5, and SQL Server (source and target).
  • Transferred data from Amex 5 to Amex 3 to create a suitable test environment for UAT and SIT, using automated scripts when the source was Teradata.
  • Developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Created PL/SQL stored procedures, functions, and packages for moving data from the staging area to the DataMart.
  • Created database objects like tables, views, materialized views, procedures, and packages using Oracle tools like Toad, PL/SQL Developer, and SQL*Plus.
  • Performed security tasks like granting and revoking permissions.
  • Also wrote performance-tuned PL/SQL queries, stored procedures, and user-defined functions to decrease CPU usage time and avoid deadlocks between transactions.
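The staging-to-DataMart move with automatically generated primary keys described above can be sketched generically. A hypothetical miniature using Python's built-in sqlite3 (invented names; AUTOINCREMENT stands in for the Oracle sequence/trigger that generated keys in the real system):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_customer (name TEXT, region TEXT);
CREATE TABLE dm_customer (customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
                          name TEXT, region TEXT);
""")

def load_customers_to_mart():
    # Move staged rows into the mart in one set-based INSERT...SELECT,
    # the pattern a PL/SQL procedure would use, then clear staging.
    cur.execute("INSERT INTO dm_customer (name, region) "
                "SELECT name, region FROM stg_customer")
    cur.execute("DELETE FROM stg_customer")
    conn.commit()

cur.executemany("INSERT INTO stg_customer VALUES (?, ?)",
                [("Acme", "US"), ("Globex", "EU")])
load_customers_to_mart()
keys = [k for (k,) in cur.execute(
    "SELECT customer_key FROM dm_customer ORDER BY customer_key")]
staged_left = cur.execute("SELECT COUNT(*) FROM stg_customer").fetchone()[0]
```

The set-based INSERT...SELECT (rather than a row-by-row cursor loop) is also the usual first step in the kind of CPU-usage tuning the last bullet mentions.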

Environment: UNIX, Oracle 9i/10g, Windows Server 2003.

Languages: PL/SQL, SQL, UNIX shell scripts

ETL tools: Informatica, Abinitio
