DBA/Data Services Developer Resume
Arlington, VA
Objective:
To obtain a position as a Data Warehousing / OLTP database programmer/architect with a results-oriented company that seeks an ambitious, career-conscious professional, where acquired skills will be applied toward continued growth and advancement.
Sr. Programmer Analyst / Technical Lead / Architect / Data Modeler / DBA
- Experienced programmer and lead/architect for medium- to large-scale projects, with an in-depth understanding of every project phase and excellent communication skills
- SQL Server programmer with in-depth knowledge of database programming: script writing, stored procedures, complex views, triggers, and performance tuning
- Oracle programmer for all database programming requirements
- In-depth relational database design fundamentals
- In-depth dimensional modeling experience
- In-depth relational data modeler
- In-depth Crystal Reports design
- Attention to detail in documenting processes
- Extensive data conversion and ETL experience using SSIS, DTS, Data Integrator, Data Junction, and scripts
- Excellent Business Objects universe developer and WEBI reports developer
- Proficient Business Objects ETL Data Services developer
- 5+ years of hands-on experience using OLAP tools
- Recognized several times by customers for focused customer service
- Strong analysis skills in both functional business and technical areas
Operating Systems: Windows 95/98/2000/NT/XP
Database Programming: SQL Server 2000/2005/2008, Oracle 9.x
Languages: VB.NET, ActiveX scripting
Technologies: Data warehousing, application development
ETL Experience: SSIS, SAP Data Services, SQL Server DTS, Data Junction
Warehouse Metadata Modeling: Cognos Impromptu, Business Objects 4.0 IDT and universes
Report Generation Tools: Cognos Impromptu, Cognos PowerPlay, Crystal Reports 9.0/10.0/11.0, Business Objects WEBI, SSRS
Dimensional Modeling: Erwin, Cognos PowerPlay Transformer, SSAS
Professional Experience
Confidential
DBA/Data Services Developer
- An advertising specialty business focused primarily on promotional product suppliers and distributors under an association subscription model, providing services for product publishing, sample virtualization, order creation, inventory fulfillment, and other associated activities.
- Worked as a production support DBA; set up SQL Server 2012 clustered high-availability servers with automated restores from other environments; and delivered reporting, via a home-grown solution, on the billions of rows collected about user activity across various portals.
- The production support role involves numerous daily data movements across 20 servers in each environment using SQL Server jobs, stored procedures, and in-house metadata management with alerts; any failures in these movements must be resolved within a stipulated time frame each day.
- Handled terabytes of relational and web usage statistics for reporting services, including XML data processing and performance tuning of the procedures that process that data. Proposed a real-time analytics approach, requiring some changes to the web applications, that would eliminate the maintenance of an enormous code base and the issues that come with it.
- Took the lead in identifying performance bottlenecks in reporting structures and in partitioning very large tables in phases, with automated deployments to the various environments (a sketch of the partitioning approach follows this list). We are also evaluating columnstore indexes, in preparation for which the reporting servers were migrated to SQL Server 2014.
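A minimal sketch of the phased partitioning approach, assuming a monthly grain; the table and column names (dbo.PortalActivity, ActivityDate) are illustrative, not the actual production objects:

```sql
-- Monthly RANGE RIGHT partitioning for a large activity table.
-- Boundary values are examples; automated deployments add new
-- boundaries as months roll forward (via SPLIT RANGE).
CREATE PARTITION FUNCTION pfActivityMonth (date)
AS RANGE RIGHT FOR VALUES ('2014-01-01', '2014-02-01', '2014-03-01');

CREATE PARTITION SCHEME psActivityMonth
AS PARTITION pfActivityMonth ALL TO ([PRIMARY]);

CREATE TABLE dbo.PortalActivity
(
    ActivityId   bigint NOT NULL,
    ActivityDate date   NOT NULL,
    PortalUserId int    NOT NULL,
    PageViews    int    NOT NULL,
    -- the partitioning column must be part of the clustered key
    CONSTRAINT PK_PortalActivity
        PRIMARY KEY CLUSTERED (ActivityDate, ActivityId)
) ON psActivityMonth (ActivityDate);

-- On SQL Server 2014, a clustered columnstore index could be evaluated
-- in place of the rowstore clustered key on the reporting servers.
```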
ETL Architect
- A short project focused on data synchronization from Momentum, an enterprise non-relational, distributed big-data ERP system on an Oracle 11g backend, to a local in-house reporting database (also Oracle 11g) for the identified tables, supporting financial reporting, with Business Objects Data Services as the ETL tool. The main activities accomplished:
- Identified the change data capture mechanisms and statistically validated the faithful synchronization of changed data using time-based changed-data identification (a sketch follows this list).
- Because the Momentum system is an ORM-based, service-oriented architecture with no relationships present at the database level, modeled the reporting database with explicit relationships for the identified reporting schema.
- Identified the data integrity validation mechanisms and synchronization validation mechanisms.
- Created ETL jobs using SAP Data Services for data replication.
- Identified and presented means of ensuring faithful replication, since data is provided through segmentation of constituent components in the enterprise system, and issues in the segmentation layer could leave the reporting schema relationally unstable.
- Worked as a DBA in creating the consolidated script for reporting schema creation, establishment of table relationships, and indexing for performance improvements.
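A minimal sketch of the time-based change data capture pattern (Oracle dialect); momentum.gl_transactions, rpt.etl_watermark, and their columns are illustrative names, not the actual Momentum schema:

```sql
-- Pull every row changed since the last successful sync; the watermark
-- (rpt.etl_watermark.last_sync_ts) is advanced only after the Data
-- Services load commits, so a failed run is simply retried.
SELECT t.*
FROM   momentum.gl_transactions t
WHERE  t.last_update_ts > (SELECT last_sync_ts
                           FROM   rpt.etl_watermark
                           WHERE  table_name = 'GL_TRANSACTIONS');
```

Comparing row counts and checksums between source and target over the same window is one way to validate the synchronization statistically.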
Sr. Database Engineer/ETL Architect
Joined Confidential as a Senior Database Engineer and embarked on standardizing data conversion from legacy systems to a case management system called AiCMS.
- Performed functional and technical analysis to formulate a strategy for improving standardization and accuracy, with metrics on exceptions and easy identification of errors.
- Analyzed and created a staging database with proper relationships across business collections for validating incoming data before loading the third-normal-form physical database (a validation sketch follows this list).
- Built the SSIS ETL that loads the validated staging data into AiCMS, complete with row-level error handling and auditing that cross-references the surrogate keys created for each business area of a row in the collection.
- Produced documentation covering data flow and metadata for each data attribute collected in staging, defining templates for the business collections and recording assumptions to cover any TBD points for each version of the ETL.
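A minimal sketch of the pre-load validation idea, assuming illustrative staging names (stg.CaseRecord, stg.Party, ErrorCode); the actual staging schema and rule set were more extensive:

```sql
-- Flag case rows whose party reference does not resolve, instead of
-- failing the whole load, so exceptions can be reported with metrics.
UPDATE c
SET    c.ErrorCode = 'ORPHAN_PARTY'
FROM   stg.CaseRecord AS c
LEFT JOIN stg.Party   AS p
       ON p.LegacyPartyId = c.LegacyPartyId
WHERE  p.LegacyPartyId IS NULL;

-- Exception metrics; only rows with ErrorCode IS NULL move on to AiCMS.
SELECT ErrorCode, COUNT(*) AS ExceptionRows
FROM   stg.CaseRecord
WHERE  ErrorCode IS NOT NULL
GROUP  BY ErrorCode;
```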
DBA/Developer
- Joined as a SQL Server DBA/Developer to help support an enormous client base across multiple applications, including an Enterprise Assignment Management System.
- Quickly understood the business and discharged both DBA and complex development activities, including SSIS integration with the enterprise system. Day-to-day activities included database backups/restores for multiple clients (a backup sketch follows this list), profile generation for the enterprise application, performance tuning of very complex procedures, and database security. Generated very complex scripts for configuring a web-based calculation engine.
- Supported multiple projects while still delivering the desired productivity in all required areas.
- Guided an offshore team in building an analytical data processing application using SSIS, with automated deployment, SQL Server Agent scheduling, and maintenance of the file-system deployment.
- Played the role of application DBA, improving the performance of various routines in transactional systems with databases of 50 GB or more. Fine-tuned many queries and improved the accuracy and auditability of data imports and exports.
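A minimal sketch of per-client backup automation, assuming a 'Client%' database naming convention and a UNC backup path; both are illustrative:

```sql
-- Loop over online client databases and issue a compressed full backup
-- for each; QUOTENAME guards against unusual database names.
DECLARE @db sysname, @sql nvarchar(max);

DECLARE dbs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name
    FROM   sys.databases
    WHERE  name LIKE 'Client%' AND state_desc = 'ONLINE';

OPEN dbs;
FETCH NEXT FROM dbs INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BACKUP DATABASE ' + QUOTENAME(@db)
             + N' TO DISK = N''\\backupshare\sql\' + @db + N'.bak'''
             + N' WITH COMPRESSION, INIT;';
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM dbs INTO @db;
END
CLOSE dbs; DEALLOCATE dbs;
```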
SSIS Developer / Integration Specialist
- Wrote a single query using the FOR XML PATH clause to generate the entire hierarchical product catalog as XML output for interfacing with the Certona recommendations system (a sketch of the pattern follows this list). This query is slated to be enhanced into a web service through which any company can consume Chico's product data set.
- Performed extensive analysis of the data mapping between the ATG e-commerce system and the Epicor store system. Programmed a data extraction routine that acts as a data extraction API, letting the consuming system manage extraction by date range, single invoice, or multiple invoices, and providing regeneration and purging capabilities to keep the data extraction schema tidy. It is very efficient at processing the data for consumption.
- Created an SSAS cube with the primary retail dimensions for understanding data patterns and variations with respect to changes in the dimensional attributes. This led to an understanding of the discounts and other decisions that drove changes in sales, apart from the other fixed contributing factors.
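A minimal sketch of the nested FOR XML PATH pattern; dbo.Category and dbo.Product are illustrative stand-ins for the actual catalog tables:

```sql
-- Emit categories as elements with attributes, each containing its
-- products as nested elements; TYPE keeps the inner XML unescaped.
SELECT c.CategoryId   AS '@id',
       c.CategoryName AS '@name',
       (SELECT p.Sku         AS '@sku',
               p.ProductName AS 'name',
               p.ListPrice   AS 'price'
        FROM   dbo.Product AS p
        WHERE  p.CategoryId = c.CategoryId
        FOR XML PATH('product'), TYPE)
FROM   dbo.Category AS c
FOR XML PATH('category'), ROOT('catalog');
```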
Data Warehouse Developer / Dimensional Modeler
- Started as a Level C SQL Server developer in the services division of Accenture. Deputed as a dimensional modeler / ETL developer to work on the Earned Value Management system of the EVCE department, a dimensional system that accepts data from various sources, validates it, and provides input to other systems for analysis.
- Gathered the dimensional modeling requirements and used Visio and Erwin for logical modeling and process-flow schematics. The Kimball methodology was used for dimensional modeling on this migration project, to incorporate the different grain the source financial system adopted following changes requested by external agencies.
- Wrote designs to automate and architect the configurability of the ETL for deploying the solution to multiple clients. Handled ETL changes focused primarily on easy migration of existing customers to the new version, thereby achieving standardization in the process.
- The ETL is primarily based on loading Excel spreadsheets received from multiple non-homogeneous sources; the connection manager is set at run time to make staging loads dynamic across clients. Integration with Project Server is accomplished using the web service task.
- Removed hard coding from many packages by altering setups and/or introducing modular views, procedures, and functions. Modularized and standardized the auditing of all dimension and fact loads to improve the maintainability of the solution.
- Wrote a dynamic stored procedure using MERGE that automates dimension loading by storing the metadata for all dimension loads in a table; it is called from the various packages with the proper parameter values (the MERGE pattern is sketched after this list).
- Introduced source-code maintenance with TFS for the first time. Implemented ticket generation mechanisms for production issues and always resolved tickets within 2 hours. Mentored developers from other backgrounds to make them productive in SSIS development.
- Involved in analysis and development of ETL requirements for a separate company effort to streamline services in this area.
- Generated workflow and ETL process-flow diagrams; streamlined the documentation requirements and produced multiple design documents. Presently working on data migration from the standardized version to this version using scripting and tools such as TOAD for SQL Server.
- A program for automating and instrumenting package execution, found on CodePlex (open source), is being modified for integration with SharePoint.
- Analyzed the cost feeds and implemented validation routines for identifying problems in the feeds and cleansing the data.
- Awarded performance points by managers several times for streamlining the business logic behind very loosely defined business definitions.
- Involved in analysis and modification of SSRS reports. Created a report model as an alternative for deriving ad hoc reports. Many reporting requirements are complex in how they are derived, so the logic is pushed down into stored procedures.
- Created a standard SSAS cube for exploring the data after loading, for testing and maintaining accuracy before the data is sent to receiving systems via reports.
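A minimal sketch of the MERGE pattern behind the dimension loads; the real procedure assembled this statement dynamically from the metadata table, and dbo.DimProject / stg.Project are illustrative names:

```sql
-- Upsert one dimension from its staging table, capturing the action
-- taken per row for the standardized load auditing.
DECLARE @audit TABLE (LoadAction nvarchar(10));

MERGE dbo.DimProject AS d
USING stg.Project    AS s
   ON d.ProjectCode = s.ProjectCode          -- business key, from metadata
WHEN MATCHED AND (d.ProjectName    <> s.ProjectName
               OR d.ProjectManager <> s.ProjectManager) THEN
    UPDATE SET d.ProjectName    = s.ProjectName,
               d.ProjectManager = s.ProjectManager
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ProjectCode, ProjectName, ProjectManager)
    VALUES (s.ProjectCode, s.ProjectName, s.ProjectManager)
OUTPUT $action INTO @audit (LoadAction);

-- Per-load audit counts, fed back to the calling package.
SELECT LoadAction, COUNT(*) AS RowsAffected
FROM   @audit
GROUP  BY LoadAction;
```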
Sr. Technical Consultant
Worked as a tech lead/architect on a very complex migration project from an old application to a new thin-client application developed by the company.
Key Contributions:
- Wrote designs for extending the application using its API. Changed the design methodology to improve productivity by adopting pseudo-code mechanisms. Modeled the database extensions required to accommodate customizations.
- Good understanding of object-oriented methodology and thin client/server applications.
- Improved the designs for rapid application development, heavily minimizing the iterations on each extension.
- The project was completed within 12 months, considered the best schedule possible given the data anomalies and the number of extensions required.
- Managed data migration from a heavily denormalized data structure into a different, normalized structure with minimal data inconsistencies.
- Supported multiple projects in the areas of functional analysis, technical analysis, data conversion analysis, and configuration management, while involved in development in parallel.
- Developed batch job processes using the in-house BJS (Batch Job Submitter) application environment.
- Used SSIS to develop an application for de-duping addresses using Fuzzy Grouping; interfaced with BJS using the SSIS API.
- Used Data Integrator for data migration from the old data structures to the new ones, after the data had been cleansed externally for data mining requirements and adherence to migration standards.
- For one of the clients, migrated Cognos Impromptu reports to Business Objects universes and WEBI reports, including a security implementation in WEBI.
- Data Integrator, scripts, and SSIS were all used for migrating data from the old version of the system to the new version and for populating the membership mart.
Sr. Consultant
Worked on a Confidential ETL automation project for about four months, writing designs to convert a manual, hand-coded ETL into SQL Server DTS packages. Performed extensive analysis and segregated loading into five stages covering all intermediate and final steps. Developed the packages and completed testing of the ETL.
Confidential, Vienna, VA
Sr. Reports Analyst / Sr. Reports Developer / Dimensional Modeler
Working as a reports analyst and technical consultant for customers spanning the entire USA, won over many happy customers by analyzing reporting requirements with the user community and designing good-looking, useful reports.
Key Contributions:
- Understand user requirements thoroughly and document them
- Design the reports using Cognos Impromptu & PowerPlay, Crystal Reports, or Business Objects WEBI
- Train the customers in the usage of the reports
- Create catalogs for reporting requirements and work with the IT team to maintain the catalogs for our earlier application version using Cognos Impromptu.
- Create/modify Business Objects universes for the new suite of applications and train customers.
- Crystal Reports has been used for the past 4 years across many reporting requirements in various parts of the application: reports run through ODBC with parameters passed from the screen, reports embedded in standalone batch job processes, and standalone Crystal reports run from a reporting application.
- Participate in functional analysis of data mart requirements; create/modify the physical model; create/change ETL programs using BO Data Integrator; create/modify universes; and test the data mart results against the transactional database using summarized transactional views.
- Work on data migrations involving transactional replication, writing dynamic stored procedures that merge very complex heterogeneous data into our Enterprise Assignment Management system.
- Conducted analysis and modified the dimensional models to incorporate customizations for the clients. Used Erwin for logical dimensional modeling, aiding gap analysis and the ETL modification requirements.
- With very strong communication and database programming skills, brought several IT teams up to speed for smooth handover of projects.
- Modified the change data capture to match the membership mart customizations for one of the clients.
- Analyze and determine the best dimensional structure based on the querying capabilities of end tools such as Web Intelligence. For example, a string of demographic codes is stored on the customer dimension, and the Web Intelligence query checks whether a customer's amalgamated demographic codes, bounded by start and end dates, form a subset of that string. The string is built on the fly by a native SQL Server function, yielding an immense productivity gain.
- Tune the performance of ad hoc reports using various methods: rebuilding indexes, creating new indexes, loading and indexing intermediate tables, using hash (temp) tables as intermediates in stored procedures, and applying newer relational operators such as CROSS APPLY and MERGE wherever they improve performance (the CROSS APPLY pattern is sketched below).
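A minimal sketch of the CROSS APPLY tuning pattern, using illustrative names (dbo.Customer, dbo.Assignment); with an index on (CustomerId, AssignedOn DESC), the APPLY resolves to one seek per customer instead of a scan and sort over the whole assignment table:

```sql
-- Latest assignment per customer without a self-join or a window
-- sort across all assignments.
SELECT c.CustomerId,
       c.CustomerName,
       a.AssignmentId,
       a.AssignedOn
FROM   dbo.Customer AS c
CROSS APPLY (SELECT TOP (1) AssignmentId, AssignedOn
             FROM   dbo.Assignment
             WHERE  CustomerId = c.CustomerId
             ORDER  BY AssignedOn DESC) AS a;
```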