
Senior ETL DevOps Lead/Cloud Resume

Edison, NJ

SUMMARY

  • Over 10 years of experience in designing, developing, and maintaining large business applications involving data migration, integration, conversion, and data warehousing.
  • Experience includes thorough domain knowledge of Banking, Insurance (and reinsurance), Healthcare, Pharmacy, and Telecom industries.
  • Well versed in developing and implementing MapReduce jobs on Hadoop to work with Big Data.
  • Experience with the Spark processing framework, including Spark Core and Spark SQL, and with NoSQL databases such as HBase and MongoDB.
  • Experience in importing and exporting data with Sqoop between HDFS and relational database systems (RDBMS), including Teradata (see the sketch after this list).
  • Experience working with various versions of Informatica PowerCenter client and server tools.
  • Reviewed and assessed business requirements, identified gaps, defined business processes, and delivered project roadmaps covering documentation, initial source data definitions, mappings, detailed ETL development specifications, and operations documentation.
  • Expertise in data warehousing, ETL architecture, data profiling, and business analytics warehouse (BAW).
  • Designed and developed Informatica mappings to extract, transport, and load data into target tables in Teradata.
  • Created workflows, worklets, and tasks in Workflow Manager to schedule loads at the required frequency, and passed the data to Microsoft SharePoint.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Involved in generating reports from Data Mart using OBIEE and working with Teradata.
  • Expertise in tuning the performance of mappings and sessions in Informatica and determining the performance bottlenecks.
  • Experience in creating pre-session and post-session scripts to ensure timely, accurate processing and ensuring balancing of job runs.
  • Experience in integrating various data sources such as SQL Server, Oracle, Teradata, flat files, and DB2 on mainframes.
  • Strong experience with complex PL/SQL packages, functions, cursors, triggers, views, materialized views, T-SQL, and DTS.
  • Thorough knowledge of OLAP variants such as DOLAP, MOLAP, ROLAP, and HOLAP.
  • In-depth knowledge of designing fact and dimension tables and physical and logical data models using Erwin 4.0, including Erwin methodologies such as forward and reverse engineering.
  • Experience in creating UNIX shell scripts and Perl scripts.
  • Knowledge of report development using Business Objects, Cognos, and MicroStrategy.
  • Knowledge of installing and configuring the Informatica server with SQL Server and Oracle; able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
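
As a minimal illustration of the Sqoop import/export work noted above, the sketch below drives Sqoop from Python; the JDBC URL, credentials, table names, and HDFS paths are hypothetical placeholders.

    import subprocess

    # Hypothetical connection details and paths -- for illustration only.
    JDBC_URL = "jdbc:oracle:thin:@//dbhost:1521/ORCL"

    # Import an RDBMS table into HDFS.
    subprocess.run([
        "sqoop", "import",
        "--connect", JDBC_URL,
        "--username", "etl_user",
        "--password-file", "/user/etl/.password",
        "--table", "CUSTOMERS",
        "--target-dir", "/data/staging/customers",
        "--num-mappers", "4",
    ], check=True)

    # Export a cleansed HDFS directory back to a relational table.
    subprocess.run([
        "sqoop", "export",
        "--connect", JDBC_URL,
        "--username", "etl_user",
        "--password-file", "/user/etl/.password",
        "--table", "CUSTOMERS_CLEAN",
        "--export-dir", "/data/curated/customers",
    ], check=True)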

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter, PowerExchange for DB2, Metadata Reporter, Data Profiling, Data Cleansing, Star & Snowflake Schema, Fact & Dimension Tables, Physical & Logical Data Modeling, DataStage, Erwin

Business Intelligence Tools: Business Objects, Cognos

Databases: MS SQL Server, Oracle, Sybase, Teradata, MySQL, MS-Access, DB2

Database Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace

Development Languages: C, C++, XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting

Other Tools and Technologies: MS Visual SourceSafe, PVCS, Autosys, crontab, Mercury Quality Center

PROFESSIONAL EXPERIENCE

Confidential, Edison NJ

Senior ETL DevOps Lead/Cloud

Responsibilities:

  • Acted as ETL lead for many DevOps projects: created designs and data models, built successful integrations across flat files, web services, and databases on the network, performed dimensional modeling for Property/Casualty applications, and implemented interfaces connected to MDM.
  • Translated business needs into long-term architecture solutions; gained exposure to data acquisition, ETL frameworks, and highly configurable reusable code, with the ability to understand and master existing code.
  • Defined and designed dimensional databases and worked on several RFCs related to database tickets.
  • Developed strategies for data acquisitions, archive recovery, and implementation of a database.
  • Performed and managed data curation activities.
  • Worked extensively in Informatica Cloud Real Time (ICRT), using the Service and Process Designer consoles to create ICRT mappings with web services and Salesforce data.
  • Created Informatica Cloud ICS and ICRT mappings to consume web services data from Microsoft Dynamics CRM and to post real-time services via web services.
  • Used Informatica Cloud services such as Processes, Topics, and Service Connections to retrieve real-time web services data.
  • Reviewed and developed object and data models and the metadata repository to structure the data for better management and quicker access.
  • Analyzed current data collection techniques and curation approaches and recommended improved, more efficient approaches.
  • Evaluated opportunities to automate the data management and curation processes.
  • Excellent understanding of data architecture principles, capabilities, and best practices, with a proven ability to define, document, and apply standards.
  • Experienced with business processes, data management, data flows, data relationships, and data quality standards and processes.
  • Installed and configured Hadoop MapReduce and HDFS, and developed several MapReduce jobs in Java for data cleaning and preprocessing (see the streaming sketch after this list).
  • Wrote MapReduce code to process and parse data from various sources, storing the parsed data in HBase and Hive using HBase-Hive integration.
  • Worked on cluster installation, DataNode commissioning and decommissioning, NameNode recovery, capacity planning, and slot configuration.
  • Worked on moving all log files generated from various sources to HDFS for further processing.
  • Developed workflows using mid-level custom MapReduce, Pig, Hive, and Sqoop.
  • Extensive experience and domain knowledge in defining and designing Master Data Management and Data Governance architecture frameworks.
  • Experienced with Informatica data management and architecture tools, particularly those relevant to data quality, master data, and enterprise metadata management; drove development and implementation of financial analytics tools.
  • Designed entity-relationship diagrams for enterprise-wide use in government and civilian reporting, business analysis, and management decisions, and worked on data modeling of very large datasets (greater than 60 TB).
  • Built a data model encompassing executive systems, with details on current and needed interfaces.
  • Implemented the ability to export to Excel for configurable charts and graphs, with support for future-state capabilities.
  • Developed an enterprise data model for finance and timesheet management, leading to launch of an executive data warehouse capable of interfacing with all systems for executive reporting.
  • Liaised with team members on data, ETL, and technical requirements.
  • Acted as data architect involved in data cleansing, data manipulation, and data design.
  • Ensured implementation of a more efficient, higher-quality data warehouse by developing a plan featuring OLTP, OLAP, and semantic taxonomy, leveraging deep understanding of current executive systems to build entity relationship diagrams of cross-functional databases.
  • Additionally, aided finance teams throughout data warehouse enhancement by evaluating existing reports to determine efficiencies and engineer key improvements.
  • Built a dashboard with standard reports, charts and graphs that effectively displayed data from mandated executive reports, collaborating with the CIO, stakeholders, and the architecture board as needed.
  • Acted as Informatica ETL Architect to implement and establish Informatica IDQ Architecture.
  • Extensively used Informatica Data Validation Option (DVO) to maintain test scripts and validate data across projects, and used the Informatica PowerCenter data validation client to compare data sets for testing and validation.
  • Extensively used Informatica Data Masking to apply user-specific masking rules via dictionary files in the Informatica ILM workbench.
  • Created architectural designs for Oracle ERP and Oracle E-Business Suite products and was involved in end-to-end implementation of entire Oracle-based projects.
  • Created data model diagrams and designs using Erwin and ER/Studio, and acted as lead Informatica developer creating mappings from flat files and connecting them with XML sources.
  • Worked in an Agile Scrum model as part of the data conversion team and performed data validations.
  • Created mappings connecting the XML sources with payload DB2 and Oracle 11 databases and with WPS tables in the event-store databases maintained in Big Data.
  • Created several Informatica mappings in all the EDW environments, ran the workflows, and monitored them.
  • Acted as data architect: created designs for audit and controls using Teradata metadata tables, ran the scripts with audit-and-controls statistics, performed business and systems analysis, and was involved in running Hadoop Big Data streaming jobs to process terabytes of data.
  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involved in analyzing system specifications, designing and developing test plans.
  • Worked on the ingestion of web log data into the Hadoop platform and performed extensive data integration using Big Data.
  • Used Informatica DT Studio components such as Parser and Serializer, and created customized XML schemas configured through the Unstructured Data transformation in Informatica.
  • Accessed Informatica DT Studio projects and created DT Studio scripts, uploaded to the server, for modifying existing Informatica schemas via the Unstructured Data transformation.
  • Created several design patterns and standards documents explaining the processes above; used DT Studio authoring to create .tgp scripts; and extracted reports using Cognos.
  • Tested the databases using PL/SQL and analyzed several mapping scenarios; have SQL development expertise with Oracle and SQL Server 2008; created high-level task plans and deliverables.
  • Created all the paths and folders accessible from UNIX, ran all the Informatica workflows on UNIX via crontab (see the scheduling sketch after this list), and performed data analysis using advanced techniques.
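
The MapReduce data-cleaning jobs above were written in Java; as a rough Python equivalent, a Hadoop Streaming mapper for the same kind of log cleaning and preprocessing might look like the sketch below (the tab-separated field layout is an assumed example).

    #!/usr/bin/env python3
    # Hadoop Streaming mapper: drop malformed web-log records and
    # normalize the surviving fields. The field layout is hypothetical.
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 3:
            continue  # skip malformed records
        timestamp, user_id, url = fields[0], fields[1], fields[2]
        if not user_id.strip():
            continue  # skip records with no user id
        print("\t".join([timestamp.strip(), user_id.strip(), url.strip().lower()]))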
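
For the crontab-driven workflow runs, a small wrapper like the one below could be scheduled from cron; the service, domain, folder, credentials, and workflow names are placeholders.

    #!/usr/bin/env python3
    # Cron-friendly wrapper that starts an Informatica workflow with pmcmd
    # and waits for it to finish. All names and credentials are placeholders.
    import subprocess
    import sys

    result = subprocess.run([
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC",        # integration service
        "-d", "DOMAIN_DEV",      # Informatica domain
        "-u", "etl_user", "-p", "secret",
        "-f", "EDW_FOLDER",      # repository folder
        "-wait",                 # block until the workflow completes
        "wf_daily_load",         # workflow name
    ])
    sys.exit(result.returncode)  # non-zero exit lets cron report the failure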

Environment: Hadoop Big Data, Informatica Cloud products, Informatica PowerCenter 9.1, Informatica PowerExchange, Informatica DT, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, Cognos.

Confidential, Columbus OH

Senior ETL DevOps Lead with Cloud

Responsibilities:

  • Acted as ETL DevOps lead for many projects using databases such as Oracle; involved in modeling, estimation, requirements analysis, and design of mapping documents and planning with ETL and BI tools, MDM, and Toad across various source environments.
  • Wrote Pig scripts to handle semi-structured data as structured data and to insert data into HBase from HDFS.
  • Supported HBase architecture design with the Hadoop architect team to develop a database design on HDFS.
  • Transformed and aggregated data for analysis by implementing workflow management for Sqoop, Hive, and Pig scripts.
  • Wrote Python scripts with the Hadoop Streaming API to maintain the data extraction.
  • Wrote Python UDFs for Hive queries to meet business data-analysis requirements (see the Hive TRANSFORM sketch after this list).
  • Involved in creating Hive tables, loading them with data, and writing Hive queries.
  • Involved in converting Cassandra/Hive/SQL queries into Spark transformations using Spark RDDs in Scala and Python (see the PySpark sketch after this list).
  • Implemented custom Kafka encoders for a custom input format to load data into Kafka partitions.
  • Worked on a live Hadoop cluster running Cloudera CDH 5.x; participated in requirements analysis and creation of a data solution using Hadoop.
  • Worked extensively in Informatica Cloud Real Time (ICRT), using the Service and Process Designer consoles to create ICRT mappings with web services and Salesforce data.
  • Created Informatica Cloud ICS and ICRT mappings to consume web services data from Microsoft Dynamics CRM and to post real-time services via web services.
  • Used Informatica Cloud services such as Processes, Topics, and Service Connections to retrieve real-time web services data.
  • Coordinated ETL systems and weblog analysis work; deliverables included ERDs, data models, data flow diagrams, use cases, gap analyses, and process flow documents; have an expert understanding of the Ralph Kimball methodology; created data model diagrams and designs using Erwin and ER/Studio; and implemented interfaces connected to MDM.
  • Acted as data architect for the redesign and migration of the client's Campaign Management system.
  • Analyzed over 2,000 data-related requirements to produce Operational Data Warehouse (ODW) data models in Erwin that support the receipt and transformation of data from over 43 sources, provide the current state of the atomic data, and aggregate customer, promotion, and product performance to aid campaign list selection, qualitative analysis, scoring, and reporting.
  • Using Erwin, created entity-relationship diagrams modeling the Operational Data Store (ODS) portion of the data warehouse architecture, and worked on data normalization of very large datasets (greater than 65 TB).
  • Developed dimensional models for the data warehouse and four data marts. The data marts included Accounts Payable, Accounts Receivable, Project Systems and Sales.
  • Developed the Data Warehousing Strategy Document, which included architecture, metadata management and methodology. Developed data mapping documents, data warehouse population documents and data mart population documents.
  • Played a key role in the Extract, Transformation and Load (ETL) architecture and design. Interfaced with business users for the development and verification of requirements.
  • Prepared database sizing estimates and the tablespace and partitioning strategy, and implemented the models on an Oracle platform. Prepared SQL scripts for production implementation by the DBA.
  • Also participated in installing the Informatica Data Quality (IDQ) server and performed data manipulation.
  • Worked in an Agile Scrum model as part of the data conversion team and performed data validations.
  • Implemented integrated EDI applications using a gateway protocol and Enterprise Application Integration (EAI) for business processes across applications on Oracle 10.
  • Extensively worked with Teradata on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, MultiLoad, and TPump.
  • Worked with Teradata 14 and wrote scripts for loading data into the target data warehouse; have SQL development expertise with Oracle and SQL Server 2008.
  • Designed and developed ELT (extract, load & transform) solutions for bulk transformations of client data coming from mainframe DB2, and performed data analysis using advanced techniques.
  • Used Informatica DT Studio components such as Parser and Serializer, and created customized XML schemas configured through the Unstructured Data transformation in Informatica.
  • Accessed Informatica DT Studio projects and created DT Studio scripts, uploaded to the server, for modifying existing Informatica schemas via the Unstructured Data transformation.
  • Created several design patterns, standards documents, and ETL strategies explaining the processes above; used DT Studio authoring to create .tgp scripts.
  • Developed complex PL/SQL procedures and packages for transformation and data cleansing.
  • Developed and used Informatica MDM components for data governance and data integration projects, performed dimensional modeling, and extracted reports using Cognos.
  • Set up batches and used Informatica MDM components such as Informatica Data Director and data controls; performed business and systems analysis; and created high-level task plans and deliverables.
  • Extensively used the Informatica Debugger to validate mappings and to gain troubleshooting information about data and error conditions, and participated in file administration tasks.
  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Wrote MapReduce code to process and parse data from various sources, storing the parsed data in HBase and Hive using HBase-Hive integration.
  • Worked with HBase and Hive scripts to extract, transform and load the data into HBase and Hive.
  • Designed and developed Informatica 9.1 mappings to extract, transport, and load data into target tables in Teradata.
  • Performed ETL operations to support the data loads and transformations using SSIS.
  • Involved in migration from Oracle to SQL Server using SSIS.
  • Developed cubes using SQL Server Analysis Services (SSAS).
  • Deployed SSAS cubes to the production server and worked on installation of Facets software.
  • Generated reports from the cubes by connecting SSRS to the Analysis Services server.
  • Experience in developing and extending SSAS cubes, dimensions, data source views, and SSAS data mining models, and in deploying and processing SSAS objects.
  • Performed configuration management to migrate Informatica mappings, sessions, and workflows from development to test to production environments.
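
As an illustration of the Python-for-Hive work above, Hive can stream rows through a Python script via TRANSFORM; a minimal sketch follows, with hypothetical table and column names.

    #!/usr/bin/env python3
    # clean_udf.py -- invoked by Hive via TRANSFORM. Reads tab-separated
    # rows on stdin and emits one normalized row per input line.
    import sys

    for line in sys.stdin:
        cust_id, amount = line.rstrip("\n").split("\t")
        try:
            normalized = "%.2f" % float(amount)
        except ValueError:
            normalized = "0.00"  # default for unparseable amounts
        print("%s\t%s" % (cust_id, normalized))

The matching HiveQL would be along the lines of: ADD FILE clean_udf.py; SELECT TRANSFORM(cust_id, amount) USING 'python clean_udf.py' AS (cust_id, amount_clean) FROM sales;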
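
For the Hive/SQL-to-Spark conversions noted above, the Python side of such a rewrite might look like the following sketch (the sales table and its columns are assumed for illustration).

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive-to-spark")
             .enableHiveSupport()
             .getOrCreate())

    # Original HiveQL: SELECT cust_id, SUM(amount) FROM sales GROUP BY cust_id
    rows = spark.table("sales").rdd.map(lambda r: (r["cust_id"], r["amount"]))
    totals = rows.reduceByKey(lambda a, b: a + b)  # aggregate per customer

    for cust_id, total in totals.take(10):
        print(cust_id, total)

    spark.stop()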

Environment: Hadoop Big Data, Informatica Cloud products, Erwin, Informatica PowerCenter 9.1, Informatica PowerExchange, SQL Server 2008/2012, Oracle 11, BAW, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Mercury Quality Center, MDM

Confidential, Columbus Ohio

ETL Migration Lead

Responsibilities:

  • Acted as ETL lead, interacted with the business community, and gathered requirements based on changing needs; incorporated identified factors into Informatica mappings to build the data mart for Property/Casualty applications.
  • Acted as BI consultant to manage and lead several project teams formulating a plan for moving data from Microsoft Dynamics CRM databases to SQL Server; expert in T-SQL scripting; worked on projects upgrading SQL Server from 2008 to 2016.
  • Also acted as SQL Server DBA, troubleshooting and resolving many database tickets in SQL Server databases.
  • Used Power Query (included in Office 2016) to connect to the Cosmos cluster and download the data needed for analysis.
  • Proactively maintained the health of the database environment, including disk space, indexing, compute performance, and replication.
  • Worked extensively with SQL Server, Oracle databases, and Java programming; have strong exposure to relational databases and strong SQL skills.
  • Used SSIS for ETL and extensive object optimization, applying relational database concepts, and designed and implemented data models and objects.
  • Used SVN for source control and Autosys for job scheduling, and performed deployment and release management.
  • Ran the Database Migration Assistant to upgrade the existing SQL Server to Azure SQL Database.
  • Configured the Azure SQL Server and created the databases.
  • Set up the firewall to restrict the IP addresses that can connect to the Azure SQL database.
  • Designed systems to store and process data using Azure Managed Services and implemented data transformation jobs using the latest Azure technologies.
  • Wrote Hadoop MapReduce code to process and parse data from various sources, storing the parsed data in HBase and Hive using HBase-Hive integration.
  • Developed workflows using mid-level custom MapReduce, Pig, Hive, and Sqoop.
  • Imported data from sources such as HDFS and HBase into Spark RDDs.
  • Optimized MapReduce jobs to use HDFS efficiently through various compression mechanisms; worked on partitioning Hive tables and running the scripts in parallel to reduce their run time.
  • Collaborated with other engineers and used Azure SQL Server, SSAS (cubes), SSIS (ETL), Azure SQL Data Warehouse, Azure Data Lake, and Power BI.
  • Worked heavily on database design and programming in MS SQL Server, Sybase, Access, networked Paradox, etc.
  • Also have exposure to tools such as Talend and Vertica for dashboard analysis.
  • Experience working with columnar MPP databases such as Vertica, Teradata, and Pivotal.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL
  • Worked heavily on designing complex reports using SQL Server Reporting Services, MicroStrategy, and other BI tools.
  • Designed and coded database tables to store application data; reviewed query performance and optimized code.
  • Performed data modeling to visualize database structures; programmed views, stored procedures, and functions.
  • Developed and maintained the database and caching architecture roadmap for transactional applications.
  • Leveraged global infrastructure operations resources for related operational tasks.
  • Troubleshot database-related system problems, including performance issues, exceptions, and other business-logic failures related to the data structure.
  • Contributed to establishing database development standards, policies, and procedure documentation.
  • Implemented and contributed to version control systems and procedures for database objects, and worked with SQL Server in AWS.
  • Acted as BI consultant managing many onshore and offshore teams and successfully delivered many projects.
  • Provided technical recommendations to improve performance, analyzing data to build SSRS reports and SSAS cubes that support sound decisions.
  • Proficient in C#, ASP.NET, and JavaScript frameworks; strong problem-solving and analytical skills.
  • Able to work individually or with a multi-disciplinary team.
  • Converted existing statistical reports and processes to the standardized set of data sources and artifacts.
  • Acted as Support lead for the complete software development lifecycle, including requirements gathering, design, specifications, development, change management, testing, and deployment activities.
  • Worked closely with team members and business partners in converting business requirements to project documents.
  • Created the structure of database objects using T-SQL DDL (see the pyodbc sketch after this list).
  • Performed data migration into Oracle databases and performed complex data analysis in them using SQL and PL/SQL, writing complex queries and stored procedures.
  • Interacted with a variety of stakeholders to ensure prospective data warehouse application enhancements met business requirements; have complete life-cycle experience in modeling, designing, and developing data warehouse/ETL/BI solutions.
  • Planned, designed, developed and deployed the data warehouse applications or updates to existing applications or interfaces.
  • Designed, implemented and supported relational database models and data warehousing projects in a Microsoft environment using SQL Server 2008/2012/2014 technologies.
  • Created code checklists, DBA checklists, installation game plans, and other required process documentation for production deployments.
  • Imported and exported data within MS Excel and MS Dynamics CRM. Designed, developed and fine-tuned/optimized ETL data transformations and SSIS packages to import and export data from operational systems, vendor applications and data warehouses.
  • Have large SQL Server Analysis Services (SSAS) design and build experience including multi-tier infrastructures, partitioning, aggregations, proactive caching, actions, and role and data-level security.
  • Sourced data from a variety of applications, creating OLAP models, and generating multi-dimensional cubes using SSAS
  • Prepared deliverables such as source to target mappings and transformation rules documentation
  • Worked heavily on SAS business analytics software and created high-level analytical reports.
  • Provided tiered production support including limited on-call and participated in design and technical meetings.
  • Tracked and reported development progress and performance on a regular basis.
  • Built performance-variance reports showing a high-level summary view of planning and forecast strategies against budgets for all products; used QlikView to create custom dashboards and data visualization reports.
  • Used Microsoft Azure in many of our projects and worked with many Azure features, such as Azure Files, Azure Site Recovery, Azure ExpressRoute, Azure VM extensions, the Azure portal, and Azure Active Directory.
  • Working with multiple technical teams, architects, security officers, managers and business users, reviewed and documented workloads, system dependencies and business requirements. Mapped workloads to the capabilities of Microsoft Azure for public, private and hybrid clouds.
  • Designed, developed, implemented and documented data ingestion & integration processes
  • Monitored existing ETL processes and troubleshot errors; experienced in designing and building complete ETL/SSIS processes that move and transform data across ODS, staging, and data warehouse layers.
  • Created and facilitated efficient, cost-effective solutions and analyzed system specifications to meet business requirements.
  • Worked with Product Owners, DEV leadership, and other agile team members to evolve understanding of product needs and translated them into product specifications and working system components.
  • Designed, developed, tested, and maintained data warehouse environments, ETL processes, stored procedures, etc.
  • Utilized source control for all development work, code management, data warehouse solutions builds and deployments to production, QA, and development environments, etc.
  • Provided technical support for setting up and managing jobs in SQL Server Agent.
  • Troubleshot and resolved issues in production, QA, and DEV, including performance issues, deadlocks, cube failures, and job failures.
  • Acted as on-call support of production data warehouse environments.
  • Carefully tested new code and other changes to the data warehouse environments prior to submitting work for review by QA and/or release to production.
  • Deployed to and debugged SSIS environments (Dev, Test, Prod); designed, built, and deployed effective SSIS packages.
  • Developed SQL Server applications and optimization procedures, including DDL, DML, stored procedures, triggers, and SSIS, and performed complex performance tuning.
  • Worked with the latest cloud products (Google Cloud, AWS, Azure) and with integration platforms such as MuleSoft to connect the loose ends of data integration products.
  • Tuned the database for performance gains with the assistance of the lead.
  • Used Autosys for all ETL job scheduling and was involved in nightly ETL job scheduling and monitoring.
  • Implemented, developed, and supported Business Intelligence application dashboards.
  • Very strong in SQL, including tasks such as creating tables, views, and stored procedures.
  • Performed design and development of enterprise-level multi-tier systems for high-volume transactional and analytical databases.
  • Hands-on experience as a SQL Server DB developer with strong RDBMS architectural fundamentals.
  • Heavily experienced in data modeling: created and incorporated star and snowflake schemas, developed a conceptual model using Erwin based on requirements analysis, and used Microsoft Excel to create and analyze custom reports.
  • Created data governance graph sheets and developed normalized logical and physical database models to design an OLTP system for insurance applications.
  • Created a dimensional model for the reporting system by identifying the required dimensions and facts using Erwin r7.1.
  • Collaborated with DBAs, data architect, software developers, system architects, and quality assurance to complete business initiatives in a timely manner.
  • Maintained and improved understanding of current and emerging Microsoft SQL Server technologies.
  • Demonstrated experience in managing the set up and upgrades of complex databases, including interfacing with source control in order to generate creation, upgrade, and maintenance scripts
  • Researched and implemented practices in data ingestion, integration and transformation and developing ETL design patterns
  • Proficient in Microsoft SQL Server Analysis Services (SSAS), both multidimensional and tabular.
  • Used Data Warehouse Architecture and Modeling for all the projects involved.
  • Used Tableau, Microsoft Reporting Services 2012-2016, and Power BI for dashboard reporting.
  • Applied an understanding of relational and dimensional modeling and of performance-tuning techniques such as indexing and analyzing query execution plans.
  • Used Agile development (Scrum); able to take ownership and facilitate consensus with stakeholders; collaborated and applied problem-solving methods in an open team environment.
  • Built reusable components for a scalable ETL methodology, such as error handling, logging, and recoverability, and did high-end PL/SQL coding on Oracle servers.
  • Experience using Microsoft MVC and Visual Studio, .NET, ASP.NET, and TFS
  • Have good knowledge of client programs dealing with agencies; worked in a team environment with minimal supervision; self-motivated to get the work done.
  • Have excellent communication skills, have experience maintaining codebases of large existing applications
  • Have advanced knowledge in the Microsoft Visual Studio, C#, ASP.NET, T-SQL, and XML. Have solid understanding of relational database design principles and techniques in SQL Server, SSRS, and SSIS
  • Have solid ideas on version control to identify changes in source, solution, configuration, and other files
  • Have familiarity with Angular, Entity Framework and ASP.NET MVC and have familiarity with Data Warehousing principles and design, ETL, SSAS, SSRS, and SAS
  • Acted as ETL architect: created designs and data models, built successful integrations across flat files, web services, and databases on the network, performed dimensional modeling for Property/Casualty applications, and implemented interfaces.
  • Acted as BI architect in full life-cycle implementations of SQL Server, performed enterprise-level installation of SQL Server products, and delivered BI solutions across the complete project life cycle.
  • Acted as BI architect involved in data cleansing, data manipulation, and data design for SQL Server databases; created multidimensional cubes using SSAS; and scripted some applications in Python.
  • Acted as lead developer: created designs and data models, built successful integrations across flat files, web services, and databases on the network, and performed dimensional modeling for implementations of Microsoft Dynamics CRM, Microsoft Azure, SSIS catalogs, and Microsoft cloud applications.
  • Acted as SSIS tech lead developer, using Microsoft SSIS with T-SQL and advanced coding concepts, writing stored procedures alongside SSIS transformations to produce high-performance SSIS packages.
  • Acted as the go-to tech lead developer for troubleshooting complex SQL Server issues; worked on several data quality projects, cleansing data per client needs.
  • Created SSIS packages to clean and load data into the data warehouse from flat files and web services and to transfer data between OLTP and OLAP databases, and worked on several insurance modules.
  • Created SSIS packages to the client's functional requirements, applied concepts to meet scalability, security, and reliability goals, and worked on loading data into Big Data.
  • Used Microsoft SSIS to process large volumes of data coming from XML files, loading data defined by XML schemas into SQL Server.
  • Worked heavily on existing production SSIS packages, checked their performance, and fixed slow-running SSIS packages.
  • Good experience using the SSIS Script Task to create complex transformation logic in C#; wrote complex T-SQL scripts using stored procedures and created ODSs and data marts.
  • Created SSIS packages using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc. to generate the underlying data for reports and to export cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Created a table to save dirty records; extracted data from XML into SQL Server; and created SSIS packages using C and C++ for data importing, cleansing, and parsing; extracted, cleaned, and validated data.
  • Worked extensively with change data capture tools to compare versions of SQL code, and worked with advanced XML and MQ real-time series.
  • Created drill-down, drill-through, sub-, and linked reports using MS SSRS, and managed report subscriptions.
  • Generated multi-parameterized reports in MS SSRS, allowing users to make selections before executing reports and thus making them user friendly.
  • Worked with SQL Server change tracking through the database properties dialog box, enabling the change tracking mechanism wherever changes needed to be tracked.
  • Created and developed several interfaces using C#.NET, VB.NET, and ASP.NET; worked with .NET web interfaces; and scripted some applications in Perl.
  • Wrote MDX expressions to create named sets and calculated members, created reports by dragging data from the cube, and wrote MDX scripts.
  • Created reports by extracting data from the cube and generated reports using SQL Server Reporting Services 2014 from OLTP and OLAP data sources.
  • Used Microsoft SSAS to create many SSAS cubes and recommended reporting be done by connecting data marts to SSAS cubes.
  • Designed and deployed reports with drill-down, drill-through, and drop-down menu options, as well as parameterized and linked reports, and created some custom dashboards using Tableau.
  • Deployed and scheduled reports using SSRS to generate all daily, weekly, monthly, and quarterly reports, including current status.
  • Experience in developing job sequences with Job Activity, Start Loop, End Loop, Execute Command, User Variables, and Notification activities.
  • Managed Analysis Management Objects (AMO) using BIDS and provided knowledge transfer to the company's SQL programmers about legacy PeopleSoft Financials applications.
  • Designed maintenance plans and wrote distributed queries to extract data from different servers.
  • Created and tuned stored procedures, triggers, and tables, and created several materialized views and UDFs to be accessed by the new front-end applications.
  • Created views as vertical partitions of tables; used execution plans, SQL Profiler, and Database Engine Tuning Advisor to optimize queries and enhance database performance.
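
To ground the T-SQL DDL work referenced above, here is a minimal sketch using Python with pyodbc; the connection string, table, and values are placeholders.

    import pyodbc

    # Placeholder connection details.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sqlhost;DATABASE=edw;Trusted_Connection=yes;"
    )
    cur = conn.cursor()

    # T-SQL DDL: create a staging table for a warehouse load if absent.
    cur.execute("""
        IF OBJECT_ID('dbo.stg_customer', 'U') IS NULL
        CREATE TABLE dbo.stg_customer (
            cust_id   INT           NOT NULL PRIMARY KEY,
            cust_name NVARCHAR(100) NOT NULL,
            load_dt   DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
        );
    """)
    conn.commit()

    # Parameterized DML, as a loader script might issue it.
    cur.execute(
        "INSERT INTO dbo.stg_customer (cust_id, cust_name) VALUES (?, ?);",
        101, "Acme Insurance",
    )
    conn.commit()
    conn.close()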

Environment: SQL Server 2016, SSIS, SSRS, Cloud Technologies, Hadoop Big Data, Oracle 11g/10g, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, STAR team, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential, MI

ETL Lead

Responsibilities:

  • Acted as ETL lead: created designs and data models, built successful integrations across flat files, web services, and databases on the network, performed dimensional modeling for Property/Casualty applications, and implemented interfaces.
  • Acted as BI lead in full life-cycle implementations of SQL Server and performed enterprise-level installation of SQL Server products.
  • Acted as BI lead involved in data cleansing, data manipulation, and data design for SQL Server databases.
  • Acted as tech lead/manager leading several project teams formulating a plan for moving data from legacy systems to SQL Server, and migrated some mappings to Pentaho.
  • Acted as tech lead managing offshore teams, and acted as data migration consultant managing and enforcing the migration of data.
  • Acted as ETL lead, created many project plans, interacted with the business community, and gathered requirements based on changing needs.
  • Created SSIS packages to clean and load data to data warehouse and to transfer data between OLTP and OLAP databases.
  • Created SSIS packages using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc. to generate the underlying data for reports and to export cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Created a table to save dirty records; extracted data from XML into SQL Server; and created SSIS packages using C and C++ for data importing, cleansing, and parsing; extracted, cleaned, and validated data.
  • Created and developed several interfaces using C#.NET, VB.NET, and ASP.NET.
  • Worked on Property & Casualty apps, scheduled cube processing from staging database tables using SQL Server Agent, and wrote MDX queries and expressions for use in front-end BI applications interacting with Dynamics CRM and SharePoint.
  • Created KPI objects in line with requirements documents and optimized cubes for better query performance.
  • Aggregated the cube for a 40% performance gain and processed the cube to perform incremental updates.
  • Wrote MDX expressions to create named sets and calculated members, created reports by dragging data from the cube, and wrote MDX scripts.
  • Created reports by extracting data from the cube and generated reports using SQL Server Reporting Services 2014/2012 from OLTP and OLAP data sources.
  • Designed and deployed reports with drill-down, drill-through, and drop-down menu options, as well as parameterized and linked reports.
  • Deployed and scheduled reports using SSRS to generate all daily, weekly, monthly, and quarterly reports, including current status.
  • Prepared report layouts by placing fields in the appropriate places according to the requirements of the final report, and created an automation test framework using VBScript/QTP.
  • Experience in developing job sequences with Job Activity, Start Loop, End Loop, Execute Command, User Variables, and Notification activities.
  • Managed Analysis Management Objects (AMO) using BIDS and provided knowledge transfer to the company's SQL programmers.
  • Designed maintenance plans and wrote distributed queries to extract data from different servers.
  • Created and tuned stored procedures, triggers, and tables, and created several materialized views and UDFs to be accessed by the new front-end applications.
  • Created views as vertical partitions of tables; used execution plans, SQL Profiler, and Database Engine Tuning Advisor to optimize queries and enhance database performance.

Environment: SQL Server 2014/2012, SSIS, SSRS, Oracle 11g/10g, Teradata 6, SQL, PL/SQL, IBM AIX, UNIX Shell Scripts, Cognos, Erwin, STAR team, Remedy, Maestro job scheduler, Mercury Quality Center

Confidential

ETL Lead

Responsibilities:

  • Worked closely with business users while gathering requirements; created data model diagrams and designs using Erwin and ER/Studio; and analyzed data and supported existing reporting solutions.
  • Involved in gathering the business scope and technical requirements and created technical specifications.
  • Acted as BI lead for several databases, such as SQL Server, involved in modeling, estimation, requirements analysis, and design of mapping documents and planning using ETL and BI tools.
  • Acted as BI lead coordinating ETL systems and weblog work and planned analysis; deliverables included ERDs, data models, data flow diagrams, use cases, gap analyses, and process flow documents; have an expert understanding of the Ralph Kimball methodology; and created data model diagrams.
  • Acted as BI tech lead developer and managed various teams working on Facets software, creating plans for handling SSIS packages; also involved in data migration to clean and load data into the data warehouse and to transfer data between sources and targets.
  • Well versed in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.
  • Experienced in developing Facets Logical and Physical data models using data modeling tools such as Erwin, MS Visio and ER/Studio.
  • Experience with T-SQL (DDL and DML) in constructing tables, user-defined functions, views, complex stored procedures, clustered and non-clustered indexes, relational database models, data dictionaries, data integrity, and the appropriate triggers to facilitate efficient data manipulation and consistent data storage.
  • Efficient in writing complex T-SQL queries using joins, and automated applications using VBScript.
  • Worked on Facets healthcare software to provide reports on billing, claims, etc.
  • Experience in creating SSIS packages and migrating DTS packages from SQL Server 2005 to SQL Server 2008, and used C#.NET, VB.NET, and ASP.NET to develop several UI interfaces.
  • Extensive ETL experience using DTS/SSIS for data extraction, transformation, and loading from OLTP systems to ODS and OLAP systems.
  • Experienced in implementing the Slowly Changing Dimension wizard and Checksum transformation to load data into slowly changing dimensions in the data warehouse (see the Type 2 sketch after this list).
  • Experience in providing logging and error handling through event handlers and custom logging for SSIS packages dealing with Dynamics CRM modules.
  • Expert in scheduling jobs and SSIS packages using C and C++ and sending alerts using SQL Mail.
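
For the slowly changing dimension loads above, the core Type 2 pattern (expire the old row, insert the new version) can be sketched as two T-SQL statements driven from Python; the tables and columns here are hypothetical placeholders.

    import pyodbc

    # Placeholder connection details.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sqlhost;DATABASE=edw;Trusted_Connection=yes;"
    )
    cur = conn.cursor()

    # Step 1: close out current dimension rows whose attributes changed in staging.
    cur.execute("""
        UPDATE d
        SET d.end_date = GETDATE(), d.is_current = 0
        FROM dbo.dim_customer AS d
        JOIN dbo.stg_customer AS s ON s.cust_id = d.cust_id
        WHERE d.is_current = 1
          AND (d.cust_name <> s.cust_name OR d.cust_segment <> s.cust_segment);
    """)

    # Step 2: insert a fresh current row for new or changed customers.
    cur.execute("""
        INSERT INTO dbo.dim_customer (cust_id, cust_name, cust_segment,
                                      start_date, end_date, is_current)
        SELECT s.cust_id, s.cust_name, s.cust_segment, GETDATE(), NULL, 1
        FROM dbo.stg_customer AS s
        WHERE NOT EXISTS (
            SELECT 1 FROM dbo.dim_customer AS d
            WHERE d.cust_id = s.cust_id AND d.is_current = 1
        );
    """)
    conn.commit()
    conn.close()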

Environment: SQL Server Products, Oracle 10g/9i, SQL, SQL Developer, Windows 2008 R2/7, Toad

Confidential, Seattle WA

ETL Developer/BODS

Responsibilities:

  • Interacted with Business Users and Managers in gathering business requirements.
  • Configured the Data services to connect to the SAP R/3.
  • Developed jobs to load the SAP R/3 tables into staging area and developed transformations that apply the business rules given by the client and loaded the data into the target database.
  • Based on the requirements, extracted the data from IBM mainframes by creating datastores using an ODBC connection.
  • Participated in meetings with the SAP functional team and the legacy team to get the mappings for loading data into the Customer Master IDOC (DEBMAS06), and assessed BI and reporting requirements.
  • Used the Query transform, English North American data cleansing, USA non-certified address cleansing, the Match transform (to cleanse the firm data), and the Associate transform to generate Customer Master basic data, and conducted data analysis.
  • Used the Case, Validation, and Merge transforms and the lookup function to validate the Customer Master basic data (to find whether a record already exists in SAP); experienced in profiling and analyzing data from source systems; and worked on DB2 and mainframes.
  • Defined ETL Jobs with Data Integrator for loading data from Source to Target.
  • Installed and configured Salesforce.com Adapter and used the same to get data and update the target based on user requirements.
  • Documented the interfaces developed and provided basic training to the users.
  • Created a BO Data Services reference that gives the details of the transformations and the sample scenarios in which they are used.
  • Created and maintained BO universes from BW queries and InfoCubes.
  • Developed Xcelsius Dashboard integrated with QAAWS and Live Office.
  • Created WebI reports with complex calculations to drive alerts for Xcelsius dashboards.
  • Worked on creating variables and prompts in WebI.
  • Worked on hierarchy levels and developed hierarchies to support drill-down reports.
  • Created inventory-accuracy dashboards (Xcelsius) and management reports covering inventory management, plus wages-and-costs reporting from cost center accounting.
  • Created advanced analytics, such as speedometers and interactive metric trends, to gauge the performance of key business areas.

Environment: SAP BO Data Services XI 3.2, SAP BI 7.0, SAP ECC 6.0, MS SQL Server, SAP CRM, Business Objects/BW, SQL Server 2005/2000, Oracle 11g/10g, Teradata 6, SQL, PL/SQL

Confidential

ETL Developer

Responsibilities:

  • Based on the requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Designed and developed Informatica mappings to extract, transport, and load data into target tables.
  • Created workflows, worklets, and tasks in Workflow Manager to schedule the loads at the required frequency.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Created complex mappings using Aggregator, Expression, and Joiner transformations.
  • Designed and developed processes to handle high volumes of data and high-volume data loads within the given load intervals.
  • Involved in testing stored procedures and functions and in unit and integration testing of Informatica sessions, batches, and the target data.
  • Designed and developed table structures, stored procedures, and functions to implement business rules.
  • Designed and implemented mappings using SCD and CDC methodologies.
  • Designed and developed processes to handle high volumes of data and large data loads within a given load window.
  • Extensively involved in migration of the ETL environment, Informatica, and database objects.
  • Involved in splitting the enterprise data warehouse environment and the Informatica environment into three, one for each company.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
  • Used the Case, Validation, and Merge transforms and the lookup function to validate the Customer Master basic data (to find whether a record already exists in SAP); experienced in profiling and analyzing data from source systems; and worked on DB2 and mainframes.
  • Defined ETL jobs with Data Integrator for loading data from source to target and for data integration with SAP BW systems.
  • Documented the interfaces developed and provided basic training to the users.
  • Created Business Objects Data Integrator jobs that give the details of the transformations and the sample scenarios in which they are used, and applied business logic to data pulled from SAP HANA systems.
  • Created and maintained BO universes from BW queries and InfoCubes.
  • Developed Xcelsius Dashboard integrated with QAAWS and Live Office.
  • Created WebI reports with complex calculations to drive alerts for Xcelsius dashboards.
  • Worked on creating variables and prompts in WebI.
  • Worked on hierarchy levels and developed hierarchies to support drill-down reports.
  • Created inventory-accuracy dashboards (Xcelsius) and management reports.

Environment: Informatica PowerCenter, Business Objects Data Integrator, Business Objects, MS SQL Server 2008, Oracle 9i/8i, Trillium, HP UNIX, Erwin 4.2, PL/SQL.
