Sr. Data Warehouse & Business Intelligence Developer Resume

Baltimore, MD

SUMMARY:

  • 7+ years of experience in planning, analysis, design, implementation, development, maintenance, and support of production environments in domains such as Insurance, Healthcare, and Financial, with a strong conceptual background in database development and data warehousing applications
  • Experience in developing ETL applications for Extraction, Transformation, and Loading into Data Warehouse/Data mart. Used Informatica Power Center 10.1/9.x/8.x/7.x Designer Tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager Tools (Task Developer, Worklet and Workflow Designer).
  • Successfully extracted data from different databases such as Oracle, DB2, and Sybase.
  • Expertise in Microsoft SQL Server Integration Services (SSIS).
  • Hands-on experience in Erwin.
  • Strong knowledge of data modeling concepts: Star/Snowflake schemas, fact and dimension tables, and logical and physical data modeling.
  • Reporting experience using SSRS and Crystal Reports.
  • Experienced in Generating parameterized reports in SSRS 2012.
  • Reporting experience using MicroStrategy Desktop/Web
  • Excellent working knowledge of Data Mining Models (Classification, Association Rule Mining, Clustering, Genetic Algorithms)
  • Experience using Data Mining tools Microsoft Analysis Services.
  • Skilled in Analysis Services OLAP cube and view development in SSAS.
  • Solid knowledge of OLAP storage modes: ROLAP, MOLAP, and HOLAP.
  • Expertise in writing MDX queries against OLAP databases.
  • Extensive knowledge on Informatica Client tools like Designer, Workflow Manager, Workflow Monitor, Repository Manager and Server tools - Informatica Server, Repository Server.
  • Experience in developing and working with relational databases (RDBMS) like Oracle 11g/10g/9i/8i/8.x/7.x, DB2, and MS SQL Server 2005/20
  • Experience in installing, configuring, managing, monitoring, and troubleshooting SQL Server
  • Experience in the Software Development Life Cycle and in developing information systems.
  • Created SSIS packages for migration from legacy systems to a centralized database.
  • Proficient in using Profilers for both performance and auditing purposes.
  • Involved in migrating SSIS packages from 2008 R2 to 2012 in order to improve developer productivity, and simplify the deployment, configuration, management and troubleshooting of SSIS packages.
  • Experience in extraction and migration of data from heterogeneous data sources like Excel, MS Access, Oracle, AS400, and DB2 to MS SQL Server using the Import/Export Wizard and DTS/SSIS utilities.
  • Resolved database performance issues, database capacity issues, replication, and other distributed data issues.
  • Created indexed views, complex stored procedures, efficient functions, and appropriate triggers to facilitate efficient data manipulation and data consistency.
  • Experience in Database Mirroring, Optimizing and Tuning of SQL Queries.

SKILL:

Relational Databases: SQL Server (2000, 2005, 2008, 2012, 2014, 2016), MS Access, Oracle 11g, 10g, 9i, 8i/7.x

Querying/Reporting: T-SQL, PL/SQL, SQL*Plus, SQL*Loader, TOAD, SSRS, Crystal Reports, MicroStrategy

Modeling Tools: Erwin

ETL Tools: Microsoft SQL Server Integration Services (SSIS), Informatica PowerCenter 10.1/9.x/8.x/7.x Designer Tools (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Workflow Manager Tools (Task Developer, Worklet and Workflow Designer).

Server Scripting: ASP, JSP

Data Mining Models: Classification, Association Rule Mining, Clustering

Programming Languages: C, C++, .NET

Operating Systems: Windows 2000/2003/2008/2012/2014/2016, Linux

EXPERIENCE:

Confidential, Baltimore, MD

Sr. Data Warehouse & Business Intelligence Developer

Responsibilities:

  • Designed and developed data flows from proprietary systems to the EDGE server for member, claims, provider, and Rx data supporting Reinsurance and Risk Adjustment under the Affordable Care Act (ACA).
  • Developed a data warehousing framework implementing incremental and historical loads driven by parameterized values for any given demand.
  • Developed a framework to acquire data directly from the data acquisition layer on a daily basis.
  • ACA reporting design includes repositories of subscriber, member, and account enrollment detail, including current and historical demographics, product, health status, risk characteristics, and revenue for all Confidential market segments (Individual Under 65, Individual Over 65, Small Group, Group 51-199, Group 200+, FEP, CFA).
  • Developed Mapplets, reusable transformations, source and target definitions and mappings using Informatica.
  • Scheduled sessions and batch processes (on demand, run on time, run only once) using Informatica Server Manager.
  • Designed and developed complex mappings applying various transformations such as lookup, source qualifier, update strategy, router, sequence generator, aggregator, rank, stored procedure, filter, joiner, and sorter transformations.
  • Handled source system data from FACETS/NASCO/FEPOC (member, group, provider, and Rx data) and loaded it into proprietary systems via nightly batch jobs.
  • Implemented reporting of errored-out records from the EDGE server and supported quarterly releases.
  • Used Informatica PowerCenter for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources like XML, .csv, Database and WebServices.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability by designing and creating mapplets.
  • Used Informatica PowerCenter Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Created procedures to truncate data in the target before the session run and to enable and disable indexes.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the mappings and reporting logic.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in IDQ and imported them into PowerCenter as mappings and mapplets.
  • Generated Sub-Reports, drill down reports, Drill through reports and Parameterized reports using SSRS 2012.
  • Generated Reports using Global Variables, Expressions and Functions for the reports.
  • Used Task Factory components and transformations such as Upsert Destination, Update Batch, Null Handler, Surrogate Key, and Advanced Email and SMS Task in SSIS packages.
  • Worked on Data migration project by validating their source data row by row and field by field.
  • Creating Report-Models for ad-hoc reporting and analysis.
  • Used the bcp utility inside SSIS packages to import and export bulk data across servers.
  • Developed SSAS cubes with multiple measures and dimensions on FACT data.
  • Developed partitions on dimensions for Cube reporting.
  • Designed and developed SSRS reports sourced from cubes and Excel, and created custom reports in tools such as Pyramid Analytics and MicroStrategy.
  • Designed and developed ETL strategies and mappings from source systems to target systems, designed to cater to both initial and incremental loads.
  • Implemented daily auditing using SSIS and T-SQL to verify that data load processes completed correctly.
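
The pre-load procedure pattern described above (truncate the target and disable indexes before the session run) can be sketched in T-SQL. This is a minimal illustration only; the table, index, and procedure names are hypothetical:

```sql
-- Hypothetical pre-load step: empty the staging target and disable a
-- nonclustered index so the Informatica session's bulk load runs faster.
CREATE PROCEDURE dbo.usp_PreLoad_MemberClaims
AS
BEGIN
    SET NOCOUNT ON;
    TRUNCATE TABLE dbo.stg_MemberClaims;
    ALTER INDEX IX_stg_MemberClaims_MemberId
        ON dbo.stg_MemberClaims DISABLE;
END;
GO

-- Matching post-load step: rebuild (re-enable) the index after the load.
CREATE PROCEDURE dbo.usp_PostLoad_MemberClaims
AS
BEGIN
    SET NOCOUNT ON;
    ALTER INDEX IX_stg_MemberClaims_MemberId
        ON dbo.stg_MemberClaims REBUILD;
END;
GO
```

The two procedures would be wired to the session's pre- and post-SQL (or called from a stored procedure transformation), keeping the load itself free of index maintenance overhead.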

Environment: SQL Server 2005/2008/2012/2014/2016, Microsoft Visual Studio 2005/2008/2012, Windows 7, Linux, SSIS, SSRS, SharePoint 2010, Microsoft TFS (Team Foundation Server), ASP.NET, MS Access, Microsoft Visio, Word, Excel, FACETS 5.1, SVN Repository Browser (source control), Informatica 9.6.1 HF3, NASCO, HP ALM, Informatica Developer 10.2/9.6, Informatica Workflow Designer 10.2/9.6, Informatica Monitor 10.2/9.6, Informatica Repository 10.2/9.x, Autosys Scheduler, UNIX, Teradata 13/14, Erwin 7.5, ESP, WinSCP.

Confidential, Kansas City, MO

Data Warehouse & Business Intelligence Developer

Responsibilities:

  • As an application developer, was closely involved in all team meetings, gathering business requirements from stakeholders, providing the best possible solution from a data perspective, and documenting the process as required.
  • Worked with the FACETS team on all processes related to different vendors, providing extracts for member eligibility, enrollment, compliance, cost recovery, providers, etc.
  • In the provider subject area of the dimensional model, captured information on providers who participate in a provider network, including a provider network fact table that identifies which providers are affiliated with which networks.
  • Developed a Dimensional model that is designed to support flexible high performance reporting across all business cycles of a health plan.
  • Designed and developed the largest fact table in the dimensional model; it stores the highest volume of information, capturing claim details at the claim level of grain across a wide range of parameters such as provider, group, member, product, plan, class, line of business, payee, and date.
  • Also, worked with the FACETS Batch team in creating, updating and setting up batch jobs using Batch Wrapper execution.
  • Confidential uses these reports from the networks to update provider data; the updates assist with claims processing, batch adjudication, and member routing by customer service. When providers were not updated correctly, claims were delayed, members could not obtain services, and backlogs built up across all areas of Confidential.
  • The data model contains dimension tables with descriptive columns and fact tables with measurements and foreign keys to all supporting dimensions and outrigger tables.
  • Developed Query for generating drill down reports in SSRS 2012 and Crystal Reports 9.0.
  • Developed reports and deployed them on server using SQL Server Reporting Services (SSRS) to generate all daily, weekly, monthly and quarterly Reports including current status.
  • Experience handling production data and working on Windows.
  • Experience with both physical and logical data backups.
  • Worked with the reports team in gathering requirements and building, creating, and deploying SSRS reports for various departments at Confidential; used Report2Web and ad-hoc reports based on requirements given by business users.
  • Successfully corrected complicated extracts related to Dental and Enrollment processes; also involved in data analysis and data integrity efforts.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter and Created POC data quality mappings in Informatica Data Quality tool and imported them into Informatica PowerCenter as Mappings, Mapplets.
  • Designed and developed technical and business data quality rules in IDQ (Informatica Developer) and created scorecards to present to business users for trending analysis (Informatica Analyst).
  • Applied Slowly Changing Dimension Type 1 and Type 2 logic per business requirements.
  • Extensively worked on performance tuning, enhanced readability and scalability.
  • Worked extensively on .csv file, pdf and xml files and in isolating header and footer in single file.
  • Designed ETL packages dealing with different data sources (SQL Server, Flat Files, and XMLs etc.) and loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services
  • Working knowledge in normalization, logical and physical data models.
  • Proficiency in distributed architecture and design, data replication, data security.
  • Proficiency in SQL Server optimization, fine-tuning, and scalability techniques.
  • Implemented data mart, facts, dimensions, star schema and OLAP cubes using SQL Server Analysis Service.
  • Created Multi-Dimensional Expression (MDX) for accessing OLAP cubes.
  • Involved in creating calculated members, named sets, and advanced KPIs for the SSAS cubes.
  • Developed SSIS packages using various transformations like Lookup, Pivot, Derived column, Conditional split, Aggregate, Sort, Row Sampling, Union, Merge, Multicast.
  • Created complex SSAS cubes with multiple fact measures groups, and multiple dimension hierarchies based on the OLAP reporting needs.
  • Experience in creating Multidimensional cubes using SSAS and designing DW schemas.
  • Implemented Key Performance Indicator (KPI) objects in SSAS 2008 and created calculated members in MOLAP cubes using MDX.
  • Created calculated measures, dimension members using MDX, mathematical formulas and user defined functions, named calculations using SSAS for generation of cubes
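
A calculated member of the kind described above might look like the following in MDX; the cube and measure names are hypothetical, purely to illustrate the pattern:

```mdx
-- Hypothetical calculated member: paid-to-billed ratio on a claims cube,
-- guarded against divide-by-zero with IIF.
CREATE MEMBER CURRENTCUBE.[Measures].[Paid To Billed Ratio]
AS
    IIF([Measures].[Billed Amount] = 0,
        NULL,
        [Measures].[Paid Amount] / [Measures].[Billed Amount]),
    FORMAT_STRING = 'Percent',
    VISIBLE = 1;
```

Such members live in the cube's MDX script, so the ratio is computed at query time at whatever grain the report slices by, rather than being pre-aggregated incorrectly.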

Environment: SQL Server 2005/2008/2012/2014/2016, Microsoft Visual Studio 2005/2008/2012, Windows 7, Linux, SSIS, SSRS, SharePoint 2010, Microsoft TFS (Team Foundation Server), Dell Boomi Integration, Boomi Data Manager, ASP.NET, MS Access, Microsoft Visio, Word, Excel, FACETS 4.41, Informatica 9.6.1 HF3, NASCO, HP ALM, Informatica Developer 10.2/9.6, Informatica Workflow Designer 10.2/9.6, Informatica Monitor 10.2/9.6, Informatica Repository 10.2/9.x, Autosys Scheduler, UNIX, Teradata 13/14, Erwin 7.5, ESP, WinSCP.

Confidential, Birmingham, Alabama

Sr. DW/ETL SQL Developer

Responsibilities:

  • Designed ETL processes for extracting, transforming and loading of OLTP data into a central data warehouse
  • Integrated new systems with existing data warehouse structure and refined system performance and functionality.
  • Performed physical and logical modeling
  • Created SSIS packages to migrate data from heterogeneous sources such as MS Excel, Flat Files, CSV files and then transform and load into the data mart.
  • Responsible for enhancement of database performance using database performance and tuning optimization tools and techniques
  • Extensively worked on DTA, DMVs, online indexing, data partitioning, and tuning of long-running stored procedures and complex queries; designed the data warehouse structure and data model using Erwin and MS Visio.
  • Used DataStage to manage the metadata repository and for import/export of jobs.
  • Worked with connected and unconnected Stored Procedure transformations for pre- and post-load sessions.
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Responsible for mentoring team members, helping with assignments, providing guidance and technical oversight, and making key decisions on the technical design
  • Developed tools to automate base tasks using Python, Shell scripting and XML.
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata etc.
  • Developed data mappings between source systems and target system using Mapping Designer.
  • Developed shared folder architecture with reusable Mapplets and Transformations.
  • Worked with several facets of the Informatica PowerCenter tool - Source Analyzer, Data Warehousing Designer, Mapping & Mapplet Designer, and Transformation Designer - developing Informatica mappings for better performance.
  • Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level and the Target Level for Slowly Changing Dimensions Type1, Type2 for Data Loads.
  • Configured sessions using Workflow Manager with multiple partitions on source data to improve performance; understood business needs and implemented them in the functional database design.
  • Created the Shell Scripts to automate the execution of the SQL subprograms and to move the data to store in historical/archive folders.
  • Heavy SQL/database development experience writing efficient ad-hoc SQL queries and PL/SQL scripts, creating stored procedures, and fine-tuning queries.
  • Developed reports and deployed them on server to generate all daily, weekly, monthly and quarterly Reports including status.
  • Generated Sub-Reports, drill down reports, Drill through reports and Parameterized reports using SSRS 2012.
  • Generated Reports using Global Variables, Expressions and Functions for the reports.
  • Created ad-hoc reports incorporating filters, sorting, etc., and presented them at the client side to capture any changes they might need.
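
The Slowly Changing Dimension Type 2 loads tuned above follow a standard expire-and-insert pattern, sketched here in T-SQL with hypothetical dimension and staging table names:

```sql
-- Hypothetical SCD Type 2 load for a provider dimension.
-- Step 1: expire the current row when a tracked attribute changes.
UPDATE d
SET    d.EffectiveEnd = GETDATE(),
       d.IsCurrent    = 0
FROM   dbo.DimProvider  AS d
JOIN   dbo.stg_Provider AS s ON s.ProviderId = d.ProviderId
WHERE  d.IsCurrent = 1
  AND (d.ProviderName <> s.ProviderName
   OR  d.NetworkCode  <> s.NetworkCode);

-- Step 2: insert a new current version for changed and brand-new providers
-- (after step 1, neither has a row flagged IsCurrent = 1).
INSERT INTO dbo.DimProvider
    (ProviderId, ProviderName, NetworkCode, EffectiveStart, EffectiveEnd, IsCurrent)
SELECT s.ProviderId, s.ProviderName, s.NetworkCode, GETDATE(), NULL, 1
FROM   dbo.stg_Provider AS s
LEFT   JOIN dbo.DimProvider AS d
       ON d.ProviderId = s.ProviderId AND d.IsCurrent = 1
WHERE  d.ProviderId IS NULL;
```

Running the expire step first lets one insert statement cover both changed rows and rows new to the dimension, which keeps the mapping (or post-load SQL) simple.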

Environment: MS SQL Server Business Intelligence Development Studio 2005/2008/2012, Linux, SSIS, SSAS, SSRS, XML, Oracle, MS Visual Studio, SQL Server Management Studio, Power BI, SharePoint 2013, ASP.NET, T-SQL, IBM DB2, Informatica 9.6.1 HF3, NASCO, HP ALM, Informatica Developer 10.2/9.6, Informatica Workflow Designer 10.2/9.6, Informatica Monitor 10.2/9.6, Informatica Repository 10.2/9.x, Autosys Scheduler, UNIX, Teradata 13/14, Erwin 7.5, ESP, WinSCP.

Confidential, Charlotte, NC

BI Developer

Responsibilities:

  • Worked on customer and account segmentation codes for bank customers; this implementation gives customers better deals and more reward money.
  • Worked on settlement Process for Rewards Payments.
  • Interacted with the business users in gathering requirements and responsible for developing end to end BI solution using SQL Server Reporting Services
  • Used ERWIN to develop Logical and physical data models and mapped the data into database objects
  • Developed SSIS packages using various transformations like Lookup, Pivot, Derived column, Conditional split, Aggregate, Sort, Row Sampling, Union, Merge, Multicast.
  • Created and maintained databases and various database objects like tables, views, stored procedures, triggers, indexes, constraints using advanced SQL (Structured Query Language) programming
  • Performed Code reviews, helped with QA testing
  • Created pre-prod testing environments by sourcing data using change data capture
  • Used Joins and sub-queries for complex queries involving multiple tables from different databases
  • Optimized the database by creating various clustered, non-clustered indexes and indexed views
  • Implemented daily auditing using SSIS and T-SQL to verify that data load processes completed correctly.
  • Used Visual Team Foundation server for version control, source control and reporting.
  • Wrote complex queries, stored procedures, batch scripts, triggers, indexes, and functions using T-SQL.
  • Implemented data mart, facts, dimensions, star schema and OLAP cubes using SQL Server Analysis Service.
  • Created Multi-Dimensional Expression (MDX) for accessing OLAP cubes.
  • Involved in creating calculated members, named sets, and advanced KPIs for the SSAS cubes.
  • Created calculated measures and dimension members using multi-dimensional expression (MDX), mathematical formulas, and user-defined functions.
  • Defined the Referenced Relationships with the measure groups and fact tables.
  • Implemented MS Analysis Services setup, tuning, cube partitioning, dimension design including hierarchical and slowly changing dimensions.
  • Created and optimized complex T-SQL queries and stored procedures to encapsulate reporting logic.
  • Creating Report-Models for ad-hoc reporting and analysis.
  • Developed highly complex SSIS packages using various Tasks, Containers, Data transformations like Fuzzy Lookup, For Each Loop, For Loop Sequence Container, and FTP
  • Generated Reports using Global Variables, Expressions and Functions for the reports.
  • Identified and worked with Parameters for parameterized reports in SSRS 2012
  • Responsible for scheduling the subscription reports with the subscription report wizard.
  • Deployed the generated reports directly on the client’s workstation.
  • Improved Report Performance and performed unit testing.
  • Created multiple reports with drop down menu option having complex groupings within the report.
  • Generated matrix reports, drill down, drill through, sub reports, chart reports, multi parameterized reports.
  • Developed reports and deployed them on server using SQL Server Reporting Services (SSRS) to generate all daily, weekly, monthly and quarterly Reports including current status.
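
The daily audit mentioned above can be sketched as a simple row-count reconciliation in T-SQL, run as the final step of the load; the staging, fact, and audit table names here are hypothetical:

```sql
-- Hypothetical daily audit: compare source and target row counts for
-- today's load and record the result so mismatches can be followed up.
DECLARE @src INT, @tgt INT, @today DATE = CAST(GETDATE() AS DATE);

SELECT @src = COUNT(*) FROM dbo.stg_Transactions  WHERE LoadDate = @today;
SELECT @tgt = COUNT(*) FROM dbo.FactTransactions  WHERE LoadDate = @today;

INSERT INTO dbo.AuditLog (RunDate, SourceRows, TargetRows, Status)
VALUES (GETDATE(), @src, @tgt,
        CASE WHEN @src = @tgt THEN 'OK' ELSE 'MISMATCH' END);
```

In an SSIS package this would typically sit in an Execute SQL Task after the data flow, with a failure path (email or SMS task) triggered when the status is not 'OK'.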

Environment: SQL Server Business Intelligence Development Studio 2005/2008/2012, MS Visual Studio, ASP.NET, Oracle, IBM DB2, SQL Server SSAS/SSRS/SSIS, T-SQL, Linux, XML, MS Excel.

Confidential

SQL Server Developer

Responsibilities:

  • Involved in performance tuning (DB Tuning) and monitoring of T-SQL blocks.
  • Data warehouse application development, support, and maintenance using MS SQL Server, SSIS, PL/SQL, and Oracle database technologies.
  • Worked with functional analysts and users to understand needs and develop new functionality and tools; modeled, designed, developed, tuned, and tested back-end structures to meet all requirements.
  • Analyzed, programmed, and tested data warehouse code owned by the applications team using Oracle, SQL Server, and .NET technologies.
  • Supported the existing data warehouse platform and application, maintaining its availability and scalability.
  • Designed ETL packages dealing with different data sources (SQL Server, flat files, XML, etc.) and loaded the data into target data sources by performing different kinds of transformations using SQL Server Integration Services.
  • Created SSIS packages for File Transfer from one location to the other using FTP task
  • Managed slowly Changing dimensions (SCD) and optimized fact table inserts
  • Worked on all activities related to the development, implementation, administration and support of ETL processes for large-scale Data Warehouses using SQL Server SSIS
  • Developed, deployed and monitored SSIS Packages including upgrading DTS to SSIS.
  • Used Business Intelligence Development Studio (BIDS) to create, edit, and deploy SSRS, SSAS, and SSIS projects.
  • Experience in writing SQL Queries for pulling large volume of records from Facets and claims database using stored procedures, and Extraction Transformation and Loading (ETL) process using SSIS.
  • Developed Query for generating drill down reports in SSRS 2008 and Crystal Reports 9.0.
  • Designed and created drill down reports in SSRS 2008.
  • Generated Sub-Reports, drill down reports, Drill through reports and Parameterized reports using SSRS 2012.
  • Generated Reports using Global Variables, Expressions and Functions for the reports.
  • Identified and worked with parameters for parameterized reports in SSRS
  • Created and scheduled an SSIS package to run stored procedures that build the output report tables and regenerate reports automatically on a weekly or monthly basis using SSRS.
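
A stored procedure feeding such a scheduled report might look like this minimal T-SQL sketch; the report table, source table, and column names are hypothetical:

```sql
-- Hypothetical report procedure: rebuilds the weekly output table that the
-- scheduled SSRS report reads from.
CREATE PROCEDURE dbo.usp_BuildWeeklyClaimsReport
    @WeekEnding DATE
AS
BEGIN
    SET NOCOUNT ON;

    TRUNCATE TABLE dbo.rpt_WeeklyClaims;

    -- Keep only claims paid in the seven days up to and including @WeekEnding.
    INSERT INTO dbo.rpt_WeeklyClaims (ClaimId, MemberId, PaidAmount, WeekEnding)
    SELECT c.ClaimId, c.MemberId, c.PaidAmount, @WeekEnding
    FROM   dbo.Claims AS c
    WHERE  c.PaidDate >  DATEADD(DAY, -7, @WeekEnding)
      AND  c.PaidDate <= @WeekEnding;
END;
```

Materializing the report's result set this way keeps the SSRS dataset query trivial and makes the weekly run cheap to schedule from the SSIS package or SQL Agent.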

Environment: MS SQL Server 2008, Visual Studio, T-SQL, SSIS, SSRS, SSAS, Oracle, PL/SQL, .NET, FACETS, SQL Server 2005
