
Data Engineer Resume


Carrollton, TX

SUMMARY

  • Overall, 7+ years of IT experience in database design, development, implementation, and support using various database technologies (SQL Server 2008/2012/2016, T-SQL, Azure Big Data) in OLTP, OLAP, and data warehousing environments.
  • Excellent understanding of Relational Database Systems, Normalization, and logical and physical data modeling; proficient in creating E-R diagrams using Erwin and Visio.
  • Expertise in SQL Server and T-SQL (DDL, DML, and DCL) in constructing Tables, Joins, Indexed Views, Indexes, Complex Stored Procedures, Triggers, and user-defined functions to facilitate efficient data manipulation and consistent data storage according to the business rules.
  • Expertise in working on Data Transformation Services (DTS), SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Informatica PowerCenter, Power BI, Star Schema, Snowflake Schema, Data Cleansing, Data Scrubbing, and Data Migration.
  • Experience in developing Spark applications using Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
  • Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
  • Experience managing a Big Data platform deployed in Azure Cloud.
  • Implemented Copy Activity and custom Azure Data Factory pipeline activities for on-cloud ETL processing.
  • Expert in tuning the long running Queries and Stored Procedures by examining the execution plans.
  • Skilled in performance tuning and query optimization using SQL Profiler and Database Tuning Advisor (DTA).
  • Expertise in the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, Fact and Dimensional Tables.
  • Experience in developing web applications by using Python, Django, C++, XML, CSS, HTML, JavaScript and jQuery.
  • Experience in analyzing data using Python, R, SQL, Microsoft Excel, Hive, PySpark, Spark SQL for Data Mining, Data Cleansing, Data Munging.
  • Strong experience in Extraction, Transformation, and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange, and Power Connect as ETL tools on Oracle, DB2, and SQL Server databases.
  • Good experience in developing web applications implementing Model View Control (MVC) architecture using Django, Flask, Pyramid and Python web application frameworks.
  • Proficient in SQLite, MySQL and SQL databases with Python.
  • Hands-on experience in handling database issues and connections with SQL and NoSQL databases such as MongoDB, Cassandra, Redis, CouchDB, and DynamoDB by installing and configuring various packages in Python.
  • Implemented Cube Optimization Techniques such as Cube Partitioning, Aggregation Wizard, and Defining Hierarchies with the cube structures.
  • Developed U-SQL scripts for processing massive amounts of data in a familiar environment, used mainly for Azure Data Lake Analytics.
  • Worked on creating Azure Data Factory pipelines for moving and transforming data.
  • Worked with Azure Data Lake Store to capture data of any size, type, and ingestion speed in one single place for different operations.
  • Expert in EDI and HIPAA privacy testing with exposure to multiple transactions such as new member enrollment 834 inbound, claim 837 inbound (claim adjudication), response to claim 835 outbound, 276 status inquiry/277 response, and 270 eligibility/271 benefits.
  • Validated professional, institutional, and dental claims in FACETS; verified codes against descriptions and requirements and verified the claim adjudication process in FACETS.
  • Expertise in EDI X12 837 (Health care claim process), 835(Payment/remittance advice), 834(Benefit enrollment), 270/271(inquire/response health care benefits), 276/277(Claim status) transaction files as per HIPAA guidelines.
  • Solid Experience in working with Benefit Enrollment and Maintenance systems, Claims Adjudication and Membership systems.
  • Implementation of release management automation processes using Team Foundation Server (TFS) and Visual Studio Team Services (VSTS).
  • Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, CSV files, SAP sources/targets, and Web Services.
  • Expertise in creating indexed tables with primary key, foreign key and composite key constraints to maintain referential integrity and to maintain performance.
  • Experience in upgrading SQL Server software to new versions, applying service packs and hotfixes, and performing unattended installations.
  • Good knowledge of System Development Life Cycle (SDLC).
  • Expertise in test strategy design based on the various types of testing including integration, function, system, End to End, load, stress and performance testing, Backend (Database) Testing.
  • Expertise in creating visualizations and reports using Power BI.
  • Building and publishing Power BI reports utilizing complex calculated fields, table calculations, filters, parameters.
  • Hands on experience in SQL Server DTS and SSIS (Integration Service) package design, constructing, and deployment.
  • Experience in managing Alerts, Operators and Jobs through SQL Server Agent.
  • Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), Data Transformation Services (DTS) and SQL Server Integration Services (SSIS).
  • Experienced in maintaining Batch Logging, Error Logging with Event Handlers and Configuring Connection Managers using SSIS.
  • Worked on writing the Extract, Transform, and Load (ETL) process to implement the hybrid data environment using SSIS and custom queries.
  • Experience in enhancing and deploying the SSIS Packages from development server to production server by using package configurations.
  • Experience in Data Conversion and Data Migration using SSIS and DTS services across different databases like Oracle, MS access and flat files.
  • Experience in Microsoft Visual C# in the Script Component of SSIS.
  • Experience in Crystal Reports, Reporting Services for MS SQL Server and SQL Server Agent in Support Services.
  • Expertise in generating Ad hoc reports, tabular reports, Matrix reports, Drill-down reports, Cascaded reports, List reports, Parameterized reports, Sub reports & web reporting by customizing URL Access and mentoring end user report writers using SSRS. Skilled in sorting the data and defining subtotals for the reports.
  • Expert in creating Power Pivot, Power Query, Power View, and SSRS reports using the tabular model (DAX), Matillion, Cubes (MDX), and SQL/CQL queries.
  • Hands on experience in installing, configuring, managing, upgrading, and migrating, monitoring, and troubleshooting SQL Server.
  • Experienced in working with different sets of Linked Servers which connects from SQL to other RDBMS databases.
  • Experience in Documenting Existing Process, Creating and Analyzing Business Requirements, Functional and Technical Documents.
  • Strong ability to understand and analyze the client requirements with Excellent Communication, Analytical and Problem-solving skills.
  • Very good working knowledge of waterfall and Agile Software Development Life Cycles.
  • Self-motivated, diligent with the ability to quickly adapt to new environments and changes.
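
As an illustration of the T-SQL skills summarized above (tables with primary/foreign key constraints, an index on the join column, and an aggregating join), here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 rather than SQL Server, and every table and column name is a hypothetical example, not taken from the resume's projects:

```python
import sqlite3

# Hypothetical star-style pair: a dimension and a fact table with a FK between them.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in to FK enforcement

conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE fact_order (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
    amount      REAL NOT NULL
);
-- Index the foreign key to speed up joins back to the dimension.
CREATE INDEX ix_fact_order_customer ON fact_order(customer_id);
""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO fact_order VALUES (100, 1, 250.0)")

# Referential integrity: orders for a nonexistent customer are rejected.
try:
    conn.execute("INSERT INTO fact_order VALUES (101, 99, 10.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True

total = conn.execute(
    "SELECT c.name, SUM(f.amount) FROM fact_order f "
    "JOIN dim_customer c ON c.customer_id = f.customer_id GROUP BY c.name"
).fetchone()
print(rejected, total)  # True ('Acme', 250.0)
```

Indexing the foreign-key column is the same reasoning behind the "indexed tables with primary key, foreign key and composite key constraints" bullet: the constraint maintains integrity, while the index keeps lookups from the fact side cheap.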

TECHNICAL SKILLS

Operating Systems: Azure Virtual Machine (VM), Windows Server 2016, Windows Server 2012, Windows 2008 R2/2000 Server (64-bit), Windows 2000 Advanced Server, Windows XP, Unix, Linux, Shell Script

Databases: Azure SQL Database, Azure SQL Data Warehouse, Azure Data Factory, Azure Data Sync, Elastic Pools, SQL Server 2017/2016/2014/2012/2008 R2/2005/2000, RDBMS, Microsoft Azure VM, Business Intelligence (BI), Amazon Web Services (AWS), AWS CloudWatch, Oracle 11g, MySQL 5.x, MS Access, OLAP, OLTP, AWS RDS, AWS Cloud, Microsoft Azure Cloud

Data Modelling Tools: Erwin, Star Schema Modeling, Snowflake Schema Modeling, FACT & Dimension Tables

ETL Tools: Azure Data Factory (ADF), Azure Database Migration Service (DMS), SQL Server Integration Services (SSIS), Matillion, SQL Server Reporting Services (SSRS), ETL (Extract, Transform, and Load), Business Intelligence (BI), BCP, Informatica PowerCenter

Reporting Tools: Power BI, Cognos BI (Framework Manager, Query Studio, Report Studio), Matillion, SSRS, SAS, SAP, MicroStrategy, Crystal Reports, Tableau

Languages: T-SQL, CQL, PL/SQL, SQL, C, C++, HTML, XML/XSLT, .Net/C#

Source Control: Bit Bucket, Git, Team Foundation Server (TFS)

Methodologies: Waterfall, V Model and Agile scrum

PROFESSIONAL EXPERIENCE

Confidential

Data Engineer

Responsibilities:

  • Identify, analyze, and develop the appropriate source system for data attributes required by business stakeholders for analysis and reporting for BCBS AZ CMS compliance.
  • Solution design and development using an ETL strategy built on PostgreSQL, SQL, CQL, SQL Server, HDFS, MapReduce, Hive, Sqoop, Kafka, Python, Scala, and Spark.
  • Provide performance tuning as part of application development and manage memory to control the consumption of cluster resources.
  • Understand and interpret source and target system Data Models to determine table/column source for data attributes.
  • Reverse-engineer data models where limited or no documentation exists.
  • Perform research and interviews to identify possible intermediate locations of source system data (e.g., data warehouse, data lake).
  • Created ETL jobs in Matillion & SSIS for data loads.
  • Created new and worked on existing Matillion transformations and orchestrations for data loads and file extracts.
  • Develop end-to-end data mapping and business rules that show the transformation of source system data attributes into a standard set of business data attributes to be used for reporting and analytics.
  • Define and maintain meta-data for all data attributes; build and maintain data catalog for usability and sustainability.
  • Support logical and physical data modeling activities within Data Warehouse
  • Develop well-researched and well-articulated recommendations on data engineering approach and execution plan (considering legacy environment and future vision)
  • Build data pipelines and monitor job runs for data ingestion into Data Lake from various data outsourcing teams and validate the reports for regular business requirements.
  • For the data migration from L2 to L4, used AWS Database Migration Service (DMS).
  • Oversee the monitoring of data quality efforts within the organization and provide a central authority for the resolution of data management issues.
  • Design databases, data models, data warehouse applications, and business intelligence (BI) reports using best practices and tools.
  • Constructed product-usage SDK data and Siebel data aggregations using PySpark, Scala, Spark SQL, CQL, and Hive context in partitioned Hive external tables maintained in an AWS S3 location for reporting, data science dashboarding, and ad-hoc analyses.
  • Provide document creation, requirements gathering, analysis, validation of applications, system monitoring support.
  • Review the code changes of application frameworks designed with SQL, Java, Python, and design patterns.
  • Validate the code changes for production deployments and create the workflows to automate the regular job runs.
  • Worked in an agile methodology; interacted directly with the entire team, provided and took feedback on design, suggested and implemented optimal solutions, and tailored the application to meet business requirements while following standards.

Environment: DBeaver, SQL, CQL, PL/SQL, PostgreSQL, Netezza, SQL Server, HDFS, MapReduce, Hive, Sqoop, Kafka, Python, Scala, Spark, AWS (EC2, S3, RDS, DMS, SQS, Redshift), Power BI, Git, Jira.
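
The pipeline and validation work described in this role boils down to an extract-transform-load loop with an error path for bad rows. A tiny sketch, under stated assumptions (the file layout, table, and column names are invented for illustration, and sqlite3 stands in for the real targets):

```python
import csv
import io
import sqlite3

# Hypothetical source extract; in a real pipeline this would stream from S3, Kafka, etc.
raw = io.StringIO("member_id,claim_amount\n1,100.50\n2,bad\n3,75.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id INTEGER, claim_amount REAL)")

loaded, rejected = 0, 0
for row in csv.DictReader(raw):
    try:
        # Transform/validate: cast to target types; bad rows raise ValueError.
        rec = (int(row["member_id"]), float(row["claim_amount"]))
    except ValueError:
        rejected += 1  # route bad rows to an error path instead of failing the load
        continue
    conn.execute("INSERT INTO claims VALUES (?, ?)", rec)
    loaded += 1

total = conn.execute("SELECT ROUND(SUM(claim_amount), 2) FROM claims").fetchone()[0]
print(loaded, rejected, total)  # 2 1 175.75
```

Counting loaded versus rejected rows is what makes the job-run monitoring mentioned above possible: the counts can be logged per run and alerted on.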

Confidential, Carrollton, TX

SQL/ETL Developer

Responsibilities:

  • Involved in gathering requirements from business users and analysts by scheduling meetings on a regular basis.
  • Install and configure new SQL Servers using SQL Server Management Studio (SSMS) and Visual Studio SSDT for ETL development.
  • Create and manage databases and schema objects using T-SQL, such as tables that filter defects by Loan Number for the current audit review month against the main Encompass database, along with Indexes, Views, Triggers, Functions, and appropriate Stored Procedures to facilitate efficient data manipulation and data consistency.
  • Created Indexes on tables in the new database for faster retrieval of data in reports and in the UI for performance.
  • Extensively used SQL scripts/queries for data verification at the backend.
  • Extract, transform, and load data from source systems to Azure Data Storage services using a combination of Azure Data Factory, SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics). Data ingestion to one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processing of the data in Azure Databricks.
  • Primarily involved in Data Migration using SQL, SQL Azure, Azure storage, and Azure Data Factory, SSIS, PowerShell
  • Created C# applications to load data from Azure storage blob to Azure SQL, to load from web API to Azure SQL and scheduled web jobs for daily loads.
  • Used DDL and DML for writing triggers, stored procedures, and data manipulation.
  • Worked on writing the SQL/CQL code to pull historical data according to the specification requirements.
  • Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
  • Troubleshoot any kind of data issues or validation issues.
  • Created stored procedures using Common Table Expression (CTE) and various types of UDF functions.
  • Developed complex SQL/CQL queries, common table expressions (CTEs), stored procedures, and functions used for designing SSIS packages.
  • Used C#, VB, Python Scripts in ETL Packages wherever necessary.
  • Design SSIS packages to bring data from existing OLTP databases to new data warehouse using various transformations and tasks like Sequence Containers, Script, For loop and Foreach Loop Container, Execute SQL/Package, Send Mail, File System, Conditional Split, Data Conversion, Derived Column, Lookup, Merge Join, Union All, OLE DB source and destination, excel source and destination with multiple data flow tasks.
  • DesignedSSIS Packagesto transfer data from various sources of the company into the database that was modeled and designed.
  • Designed dynamic SSIS Packages to transfer data crossing different platforms, validate data during transferring, and archived data files for different DBMS.
  • Extracted data from different source systems such as Oracle, SQL Server, DB2, mainframes, XML, and flat files.
  • Created a Master Package to run the other ETL packages from a single master package.
  • Performed SQL/CQL tuning and optimized queries for SSIS packages.
  • Created SSIS process design architecting the flow of data from various sources to target.
  • Created Informatica mappings with SQL/CQL procedures to build business rules to load data.
  • Developed complex transformations and Mapplets using Informatica to extract, transform, and load data into data marts, the Enterprise Data Warehouse (EDW), and the Operational Data Store (ODS).
  • Used Informatica Data Explorer (IDE) and Data Quality (IDQ) to implement a data rule taxonomy repository, managing data rules to standardize and de-duplicate underwriters' data and reporting on the execution of rules against data.
  • Created different SSRS reports based on user requirements, including ad-hoc reports.
  • Created Tablix Reports, Matrix Reports, Parameterized Reports, Sub-reports, Charts, and Grids using SSRS.
  • Wrote Parameterized Queries for generating Tabular reports, Formatting report layout, creating reports using Global Variables, Expressions, Functions, Sorting the data, Defining Data Source and Datasets, calculating subtotals and grand totals for the reports using SSRS.
  • Migrated and translated legacy reports from the SSRS environment into Power BI.
  • Extracted datasets from different data sources including SQL Server, Excel, and flat files to prepare datasets for Power BI.
  • Develop reports using MS Power BI, enhancing pivot tables and charts with power pivot.
  • Implemented several DAX functions for various fact calculations for efficient data visualization in Power BI.
  • Worked extensively in Power BI report creating tabular cube, creating data model for Power BI Report, Validating data quality of the report.
  • Experience in publishing Power BI Desktop models to the Power BI Service to create highly informative dashboards, collaborate using workspaces and apps, and get quick insights about datasets.
  • Incorporated filters to narrow down data presented and slicers for appropriate user interaction, conditional formatting to spotlight alarming or profitable numbers.
  • Created reports using Crystal Reports.
  • Worked with data models with complex relationships in Power BI and connected different data sources to Power BI desktop.
  • Published many reports and visualizations created to the Power BI service and thereby develop dashboards.
  • Created multiple data visualization solutions for the initial Power BI environment business preview.
  • Bug fixes in SSIS, SSRS and stored procedures, as necessary.
  • Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
  • Tested dataflow by introducing staging, transaction, and data warehouse and data mart areas. This design provided more control over data flow and data validation.
  • Scheduling the jobs to run for ETL using SQL Server Agent.
  • Provided production support for SSIS and SSRS jobs.
  • Created numerous simple to complex queries involving self joins, correlated sub queries, CTE's for diverse business requirements.
  • Extensive knowledge of Dimensional Data Modeling, Star schema/Snowflake schema, and Fact and Dimension tables.
  • Worked on Incremental Loads using concepts like Checksum and Column to Column compare.
  • Scheduling SSRS jobs and production support.
  • Designed SSAS Cube and developed MDX queries for Analysis.
  • Provisioned VMs with SQL Server on cloud utilizing Microsoft Azure and setting up communication with the help of endpoints.
  • Created builds and release pipelines in VSTS (Azure DevOps) and done deployments using SPN (secure endpoint connection) for implementing CI/CD.
  • Developed SQL queries for back-end/database testing and participated in end-to-end testing.
  • Migrated enterprise database to Microsoft Azure with Redgate and Azure data factory.
  • Helped to develop backup and recovery strategy for databases on virtualization platform utilizing Microsoft Azure.
  • Perform thorough unit testing on all relevant platforms to ensure proper implementation.
  • Involved in developing test cases for unit testing.
  • Worked in an agile scrum methodology.
  • Implemented code check-in/check-out and managed multiple versions of complicated code using Bitbucket.
  • Used agile methodology and effectively took part in Scrum meetings to deliver quality deliverables on time (extensively used ScrumDo for project management); sprint planning occurred every two weeks.

Environment: SQL Server Management Studio 2017/2019, Visual Studio SSDT (SSIS, SSRS), SQL, CQL, Azure SQL, Snowflake, Azure Data Factory, Azure Storage Explorer, Informatica PowerCenter, Azure Blob, VSTS, TFS, Power BI Desktop, Power BI Service, Mainframe, PowerShell, C# .Net, Agile, Bitbucket.
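
The incremental loads using checksum and column-to-column compare mentioned in this role can be sketched as follows. The row layout, key column, and hashing scheme below are hypothetical illustrations, not the project's actual implementation:

```python
import hashlib

# A row is re-loaded only when the hash of its non-key columns changes;
# unseen keys are inserts, changed hashes are updates, equal hashes are skipped.
def row_hash(row: dict, key: str = "id") -> str:
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row) if k != key)
    return hashlib.sha256(payload.encode()).hexdigest()

# Hypothetical warehouse state and its stored checksums.
warehouse = {1: {"id": 1, "name": "Ann", "city": "Dallas"}}
stored_hashes = {1: row_hash(warehouse[1])}

incoming = [
    {"id": 1, "name": "Ann", "city": "Plano"},   # changed -> update
    {"id": 2, "name": "Bob", "city": "Austin"},  # new     -> insert
]

changes = []
for row in incoming:
    h = row_hash(row)
    if row["id"] not in stored_hashes:
        changes.append(("insert", row["id"]))
    elif stored_hashes[row["id"]] != h:
        changes.append(("update", row["id"]))
    stored_hashes[row["id"]] = h

print(changes)  # [('update', 1), ('insert', 2)]
```

Comparing one checksum per row instead of every column pair is the usual trade-off: cheaper comparisons at the cost of computing and storing the hash on load.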

Confidential, Hopewell, NJ

SQL/ETL Developer

Responsibilities:

  • Transformed business requirements into functional and nonfunctional requirements.
  • Created complex stored procedures to perform various tasks including, but not limited to index maintenance, data profiling, metadata searches, and loading of the data mart.
  • Used T-SQL in constructing User Functions, Views, Indexes, User Profiles, Relational Database Models, Data Dictionaries, and Data Integrity.
  • Implemented a master-child package model to improve maintenance and performance.
  • Configured packages with parameters to acquire values at runtime.
  • Optimized SSIS packages utilizing parallelisms, fast load options, buffer sizes, and checkpoints.
  • Design and create SSIS Packages for database migration between OLTP and OLAP Databases and documented all source to destination mappings.
  • Designed SSIS Packages to ETL existing data into SQL Server from different environments for theSSAScubes.
  • Create batch process to load the data from different source systems and insert same data into destination servers using SSIS/SQL Agents.
  • Used SSIS packages to read data from different files and pushed same to staging database.
  • Worked on various tasks and transformations like Execute SQL Task, Execute Package Task, Conditional Split, Script Component, Merge, and Lookup while loading the data into the destination.
  • Extensively used SSIS Import/Export Wizard, for performing the ETL operations.
  • Development of new batch programs using COBOL, DB2, JCL, and INQUIRY.
  • Scheduled the SQL Agents to run on daily and weekly basis based on the business requirement.
  • Migrated SSIS Packages to Informatica.
  • Worked on Informatica- Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformation Developer.
  • Used most of the transformations such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, Update Strategy, and Stored Procedure using Informatica.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirements.
  • Developed mappings to load Fact, Dimension tables (SCD Type 1 and SCD Type 2 dimensions) and Incremental loading and unit tested the mappings.
  • Created an email notification module in SQL to trigger emails in case any error occurs during the job execution process.
  • Involved in development, deployment, scheduling, and troubleshooting SSIS packages.
  • Used event handlers for error handling, including OnPreExecute, OnPostExecute, and OnError.
  • Involved in the design of Data-warehouse using Star-Schema methodology and converted data from various sources to oracle tables.
  • Used Python and Django for creating graphics, XML processing, data exchange, and business logic implementation.
  • Wrote and executed various MySQL database queries from Python using the Python-MySQL connector and the MySQLdb package.
  • Automated Power Query refresh using a PowerShell script and Windows Task Scheduler.
  • Using the query editor in Power BI, performed operations such as fetching data from different files.
  • Used various sources to pull data into Power BI, such as SQL Server, SAP BW, Oracle, SQL Azure, etc.
  • Designed complex, data-intensive reports in Power BI utilizing graph features such as gauge, funnel, and line charts for better business analysis.
  • Installed and configured Enterprise gateway and Personal gateway in Power BI service.
  • Created Workspace and content packs for business users to view the developed reports.
  • Scheduled Automatic refresh and scheduling refresh in Power BI service.
  • Wrote calculated columns, Measures queries in Power BI desktop to show good data analysis techniques.
  • Worked on all kind of reports such as Yearly, Quarterly, Monthly, and Daily.
  • Developed various Python scripts to find vulnerabilities with SQL Queries by doing SQL injection, permission checks and performance analysis and developed scripts to migrate data from proprietary database to PostgreSQL.
  • Developed Parameterized Queries for generating Tabular reports, Formatting report layout, creating reports using Global Variables, Expressions, Functions, Sorting the data, Defining Data Source and Datasets, calculating subtotals and grand totals for the reports using SSRS.
  • Developed data interface layer using ASP.NET and C# for data access.
  • Created Parameterized, Cascaded, Drill-down, Crosstab and Drill-through Reports using SSRS and Crystal Reports with .NET applications.
  • Involved in the design, development, and implementation of web user interfaces using C#.NET and ASP.NET.
  • UsedTemp Tablesto reduce the number of rows for joins, to aggregate data from different sources for different reports.
  • Schedule and deploy reports on Report Manager at frequencies specified in client requirements.
  • Responsible for deploying reports to Report Manager and Troubleshooting for any error occurring in execution.
  • Involved in analyzing, designing, building, and testing OLAP cubes with SSAS and in adding calculations using MDX.
  • Create and grant permissions on various Objects ensuring data security.
  • Create SQL logins and assign Roles and Authentication Modes as a part of Security Policies for various categories of User Support.
  • Involved in supporting production environment during scripts deployment process.
  • Created build scripts to configure CI/CD process using Visual Studio and Azure pipeline process.
  • Provide technical solutions and liaise with the Operations business during critical production outages.
  • Manage the remediation tasks following critical outages, to avoid repeat failures.
  • Work with the Development and Release Management teams to ensure robust code implementations.
  • Debug existing code and troubleshoot for issues, resolve data, system, and performance issues.
  • Created test scenarios and test cases based on functional documents and User stories.
  • Involved in Functional, Regression, Performance and User Acceptance Testing.

Environment: T-SQL, SQL Server 2017, SSIS, SSRS, SSAS, SSMS 2017, Power BI, Informatica PowerCenter, Azure, Visual Studio 2017/2015, VB.Net, Facets 4.3/4.7, C#, ASP .NET, XML, Windows Servers, IIS, Agile Programming, Bitbucket, and MS Office.
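
The SCD Type 2 dimension loading mentioned in this role (expire the current row, insert a new current version on change) can be illustrated with a small sketch; the column names, customer, and dates below are hypothetical:

```python
from datetime import date

# Hypothetical Type 2 dimension: each version carries validity dates and a current flag.
dim = [{"cust_id": 7, "city": "Hopewell", "valid_from": date(2020, 1, 1),
        "valid_to": None, "is_current": True}]

def apply_scd2(dim, cust_id, new_city, as_of):
    """Close the current version and append a new one when the attribute changed."""
    current = next(r for r in dim if r["cust_id"] == cust_id and r["is_current"])
    if current["city"] == new_city:
        return  # no change -> nothing to version
    current["valid_to"] = as_of      # expire the old version
    current["is_current"] = False
    dim.append({"cust_id": cust_id, "city": new_city, "valid_from": as_of,
                "valid_to": None, "is_current": True})

apply_scd2(dim, 7, "Trenton", date(2021, 6, 1))
print(len(dim), dim[0]["is_current"], dim[1]["city"])  # 2 False Trenton
```

SCD Type 1 would instead overwrite `city` in place, losing history; Type 2 preserves it at the cost of multiple rows per business key, which is why fact tables join on the surrogate version rather than the business key alone.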

Confidential, Pittsburgh, PA

SQL/ETL Developer

Responsibilities:

  • Extensively involved in designing the SSIS packages to export data of flat file source to SQL Server database.
  • Involved in creating SSIS jobs to automate the reports generation, cube refresh packages.
  • Designed high level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
  • Extract Transform Load (ETL) development using SQL Server 2008 Integration Services (SSIS).
  • Created complex SSIS packages using proper control and data flow elements with error handling.
  • Installation and configuration of SQL Server 2008.
  • Involved in creating Tablix Reports, Matrix Reports, Parameterized Reports, Sub reports using SQL Server Reporting Services 2008.
  • Enhancing and deploying the SSIS Packages from development server to production server.
  • Created Drill-down, Drill-through and Sub-Report using RDL. Promoted RDLs to Reporting Service Server.
  • Developed the chart type, Tablix, matrix reports in SSRS (SQL Server Reporting Services) like enrollment status, Financial Aid, Degree Objective from the OLAP system.
  • Used PerformancePoint Services, SSRS, and Excel as the reporting tools.
  • Wrote the expressions in SSRS wherever necessary.
  • Developed and maintained multiple Power BI dashboards/reports and content packs.
  • Created effective reports using visualizations such as Bar Chart, Clustered Column Chart, Waterfall Chart, Gauge, Pie Chart, Treemap, etc. using Power BI.
  • Developed calculated columns and measures using DAX in Power BI.
  • Used tabular model cubes as the source in Power BI to visualize the data.
  • Created reports with different features like drill through capabilities, jump to another report, use of pie charts, and bar charts.
  • Developed dashboard reports using Reporting Services, Report Model and ad-hoc reporting using Report Builder.
  • Involved in modifying report content and exporting reports in multiple formats based on the business requirements.
  • Designed and implemented Parameterized and cascading reports using SSRS.
  • Developed data model based on the BA's and the reporting Team using SSAS, SSRS.
  • Developed and published reports and dashboards usingPower BIand written effectiveDAXformulas and expressions.
  • Utilized Power Query in Power BI to pivot and un-pivot the data model for data cleansing and data massaging.
  • Created several user roles and groups for end users and provided row-level security to them.
  • Created Power BI reports using Tabular SSAS models as source data in Power BI Desktop and published the reports to the service.
  • Created OLAP applications with OLAP Services in SQL Server and built cubes with many dimensions using both star and snowflake schemas.
  • Created derived columns on the final fact and dimension tables as per the business requirements.
  • Understanding the OLAP processing for changing and maintaining multi-dimensional data warehousing, Optimizing Dimensions, Hierarchies and adding aggregations to the OLAP Cubes.
  • Played a major role in production support of SSAS (cube refresh), SSIS jobs.

Environment: SQL Server 2008/2005 Enterprise Edition, Snowflake, SQL BI Suite (SSAS, SSIS, SSRS), Power BI, VB Script, Enterprise Manager, XML, MS PowerPoint, OLAP, OLTP, MDX, Erwin, MOSS 2007, MS Project, MS Access 2008 & Windows Server 2008, Oracle.

Confidential

Software Developer

Responsibilities:

  • Actively participated in creating requirements traceability matrices, Test Scenarios, Test plans and performed inspection of the Test scripts for the application.
  • Designed efficient ETL process to transform and load data from various sources through ESB across the enterprise.
  • Created efficient Stored Procedures, Triggers, Functions, Indexes, Tables, Views, and other T-SQL scripts.
  • Extracted, Transformed and Loaded source data into respective target tables to build the required data marts.
  • Developed efficient SSIS packages for processing fact and dimension tables with complex transformations.
  • Responsible for Deploying, Scheduling Jobs, Alerting and Maintaining SSIS packages.
  • Implementing and managing Event Handlers, Package Configurations, Logging, System and User-defined Variables, Check Points and Expressions for SSIS Packages.
  • Developed efficient restorability strategies in case of SSIS package failures.
  • Responsible for report generation using SQL Server Reporting Services (SSRS) and Crystal Reports based on business requirements.
  • Wrote SQL queries in MS Access and Teradata for data manipulations.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services (SSRS).
  • Migrated databases from SQL to DB2.
  • Migrated Legacy Oracle and DB2 databases to SQL Server.
  • Involved in Creating Parameterized, Cascaded, Drill-down, Crosstab and Drill-through Reports using SSRS.
  • Created reports using SSRS, including Parameterized and Cascading reports.
  • Created Cubes from which data can be retrieved rapidly by enterprise information consumers.
  • Worked on Multidimensional Expressions (MDX) queries on Cubes.
  • Experience in Modifying, Copying Adobe files containing embedded objects using Documentum.
  • Developing .Net assemblies in C# for business logic and data access.
  • Involved in the development of the Business logic layer using C# and Data access layer using C# and ADO.NET.
  • Configured and managed data sources, data source views, cubes, dimensions, mining structures, roles and creating hierarchies with SSAS.
  • Created Cell Level Security in cubes using SSAS.
  • Expertise in creating Perspectives, Partitions and Design Aggregations in cubes using SSAS.
  • Involved in Weekly database maintenance plans and tuning the databases and applications for better performance.
  • Created and maintained Database Projects for efficient application development and deployment.

Environment: SQL Server 2005/2008/2012, SSIS, SSRS, SSAS, Crystal Reports 10, Visual Studio, C#, ADO .Net.
