
Solution Architect Resume


TX

SUMMARY

  • 12+ years of overall IT experience in software architecture, technical leadership, and development. In-depth knowledge of Microsoft products including MS SQL Server 7.0, 2000, 2005, 2008, and 2012 (Denali), SSRS, and SSIS, as well as DB2, DataStage, Oracle, PDW, T-SQL, PL/SQL, MongoDB, Cassandra, Hadoop, Hive, Pig, Sqoop, Flume, Kafka, Impala, Scala, Informix, ERWIN, and Sybase.
  • Good knowledge of data marts, MDM, data warehousing, Operational Data Store (ODS), OLAP, OLTP, and data modeling, including multidimensional data modeling, star schemas, snowflake modeling, Slowly Changing Dimensions (SCD), and fact and dimension tables using Analysis Services.
  • Residential Legal Status: US Citizen.
  • Strengths include data architecture, database administration, production support, and performance monitoring and tuning.
  • Expertise in database optimization, role mining, and entitlement modeling.
  • Expertise in conceptual, logical, and physical data modeling; experienced with JAD sessions for requirements gathering, creating data-mapping documents, and writing functional specifications.
  • Experienced in optimizing performance in relational and dimensional database environments by making proper use of Indexes and Partitioning techniques.
  • Three years' experience installing, configuring, and testing Hadoop ecosystem components.
  • Experience importing and exporting data with Sqoop between HDFS and relational database systems/mainframes, and vice versa. Migrated mainframe data to Hadoop and provided solutions for several integration gaps, along with a roadmap for the large-scale migration.
  • 3 years' experience as an AWS Solution Architect/Consultant for new development and on-premises migration to AWS/Azure.
  • Worked as Lead Solution Architect / Enterprise Architect with enterprise companies and cloud environments such as Salesforce and AWS.
  • Four years' experience in SAP enterprise application architecture, providing roadmaps and blueprints for current and target states.
  • Expertise in Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Good experience with Windows Azure Service Bus and Windows Azure DocumentDB.
  • Expert in business intelligence applications using Cognos tools such as Cognos 10.1 and 8.x BI (Report Studio, Analysis Studio, Query Studio).
  • Experienced in WAMP (Windows, Apache, MySQL, and Python) and LAMP (Linux, Apache, MySQL, and Python) architectures.
  • Wrote various MapReduce programs in Java for data extraction, transformation, and aggregation from multiple file formats, including XML, JSON, CSV, and other compressed file formats, storing the refined data in partitioned tables in the EDW (a sketch follows this summary).
  • Experienced in software architecture, technical leadership, and development. In-depth knowledge of SAP products including SAP HANA, BO/BI, BODS, Xcelsius dashboards, Web Intelligence, Dashboard Designer, Crystal Reports, SLT, SAP BW (BEx), ECC, and SAP Design Studio, plus data modeling, Tableau, Visio, and Oracle.
  • Exposed to all phases of the software development life cycle (SDLC), including in-depth knowledge of agile methodology, the enterprise reporting life cycle, SQL Server migrations, change-control rules, problem management, and escalation procedures.
  • Involved in the design, enhancement, and development of applications for OLTP, OLAP, and DSS using dimensional modeling with Teradata and Oracle.
  • Extensive knowledge of creating and integrating MicroStrategy reports and objects (attributes, filters, metrics, facts, prompts, templates, consolidations, and custom groups) with the data warehouse.
  • Technical expertise also includes ADO.NET, ODBC, Web Services, and XML & XSLT.
  • Experience with various extract, transform, and load (ETL) data flows using SSIS, with complex integration of SQL Server with Oracle, flat-file sources, and IBM mainframe DB2.
  • Advanced, extensible reporting skills using Crystal Reports and SQL Server Reporting Services (SSRS). Created and formatted cross-tab, conditional, drill-down, top-N, summary, form, OLAP, sub-report, ad hoc, parameterized, interactive, and custom reports.
  • Skilled in migrations from SQL Server 2000 to SQL Server 2005, SQL Server 2005 to SQL Server 2008, and mainframe to Hadoop.
  • Broad exposure to the full software life-cycle phases (system analysis, design, coding, testing, and maintenance).
  • Perseverant, meticulous and a committed team player.
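
To illustrate the MapReduce work mentioned above, here is a minimal, hedged sketch of a CSV-parsing job in Java. The class names, the delimiter, and the choice of field index are illustrative assumptions, not details taken from the actual projects.

```java
// Minimal sketch of a CSV-parsing MapReduce job of the kind described above.
// All names (CsvFieldCount, field index 2) are illustrative assumptions.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CsvFieldCount {

    // Emits (value-of-third-CSV-column, 1) for each input record.
    public static class FieldMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text outKey = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 2) {            // skip malformed rows
                outKey.set(fields[2].trim());
                context.write(outKey, ONE);
            }
        }
    }

    // Sums the counts per key.
    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "csv field count");
        job.setJarByClass(CsvFieldCount.class);
        job.setMapperClass(FieldMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```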

TECHNICAL SKILLS

Languages: .NET 1.1/2.0/3.0/3.5, C#.NET, VB.NET, C, C++, JAVA

Big Data Ecosystems: Hadoop, MapReduce, HDFS, HBase, Zookeeper, Hive, Pig, Sqoop, Cassandra, Oozie, Flume and Talend

GUI: SharePoint, MOSS 2007, WSS 3.0, .NET (Crystal Reports, ActiveX, ADO.NET), Visual Basic 6.0, T-SQL, Visual C++ 6.0, Siebel, PowerBuilder, NUnit

RDBMS: MS SQL Server 2000, 2005, 2008 and 2012 (Denali), SSRS, SSIS, Oracle.

DBMS: MS Access, AS/400.

NoSQL Databases: Cassandra, MongoDB, HBase.

Middleware Technologies: COM, COM+, DCOM, ATL, LDAP, Active Directory, MSMQ

Internet Technologies: IIS 6.0, 7.0, HTML, DHTML, Active Server Pages, ASP.NET 3.5, WCF, WPF, Silverlight, IIS, MTS, VS 2008, Personal Web Server, XML, WebSphere, MS Site Server 3.0, MS Commerce Server 2000

Tools: ERWIN, Rational Tools, Informix, FTP, EDI, Photoshop, Visio, Remedy, iSeries, SQL Reporting Services, SQL Navigator, Rapid SQL, TOAD, PVCS and Visual SourceSafe

Operating Systems: Windows NT 4.0 / 95 / 98 / 2000 / XP, MS-DOS, UNIX

Back Office Products: MS Word, Lotus, and PowerPoint

EDUCATION / TRAINING / CERTIFICATIONS

PROFESSIONAL EXPERIENCE

Confidential, TX

Solution Architect

Responsibilities:

  • Gathered business requirements and analyzed, designed, and developed this large project.
  • Provided SQL Server database-related technical insight on this data warehouse migration project.
  • Analyzed differences between old and new system change requirements, created logical and physical database designs based on the required changes, created/updated technical design documents and the production-support operations and maintenance guide, and was responsible for estimating the project's work effort.
  • Worked on installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server and Cleanse Adapter in Windows.
  • Developed complex MultiLoad and FastLoad scripts for loading data into Teradata tables from legacy systems.
  • Experienced in implementing a unified data platform to collect data from different sources using Apache Kafka brokers, clusters, and Java producers and consumers.
  • Designed, developed, debugged, tested, and promoted Java/ETL code through environments from DEV to PROD.
  • Implemented frameworks in Java and Python to automate the ingestion flow.
  • Worked with Hadoop Data Lakes and Hadoop Big Data ecosystem using Hortonworks Hadoop distribution, and Hadoop Spark, Hive, Kerberos, Avro, Spark Streaming, Spark MLlib, and Hadoop Distributed File System (HDFS).
  • Actively involved in coding using core Java and collection APIs such as Lists, Sets, and Maps.
  • Experience with Java multithreading, collections, interfaces, synchronization, and exception handling.
  • Implemented DevOps practices to reduce the gaps between development and operations, using tools for source control, build management, configuration management, monitoring, and log management.
  • Migrated mainframe legacy data into Hadoop and resolved ASCII-format conversion issues; installed software on the mainframe to load data to Hadoop.
  • Analyzed the Hadoop cluster and various big data analytic tools, including Pig, Hive, Sqoop, Spark, AWS, Cloudera, Teradata, and Java transformation of data into XML.
  • Developed the Enterprise Java Beans (Stateless Session beans) to handle different transactions such as online funds transfer, bill payments to the service providers.
  • Successfully migrated data to Hadoop within the project timeframe. Configured Spark Streaming to receive real-time data from Kafka and store the stream in the Hadoop Distributed File System (HDFS); see the sketch after this list.
  • Designed and developed NoSQL (Cassandra) solutions for all users.
  • Designed the technical specifications document for Teradata ETL processing of data into the master data warehouse, and strategized the integration test plan and implementation.
  • Worked on integrating MicroStrategy reports and objects (Attributes, Filters, Metrics, Facts, Prompts, Templates, Consolidation and Custom Groups) with the data warehouse.
  • Configured and installed Informatica MDM Hub server, cleanse Server, resource kit in Development, QA, Pre-Prod and Prod Environments.
  • Worked on a Java transformation that reads a Hive table and converts the rows into an XML file for CCP.
  • Developed the Logical and physical data model and designed the data flow from source systems to Teradata tables and then to the Target system.
  • Provided solutions for data governance, data cleanup, data profiling, and information-delivery process improvement for the EDW.
  • Used Apache Spark & Spark Streaming to move data from servers to Hadoop Distributed File System (HDFS)
  • Evaluated system performance and validated NoSQL solutions.
  • Designed Data Entitlement object model for InfoSphere Master Data management data warehouse.
  • Reviewed and optimized SSIS packages, SSRS reports, and the .NET application; provided a document listing the improvements and implemented them successfully in production.
  • Worked on Dashboard & Scorecard Reports for Corporate Financial Forecasting as per user requirements using XSL, HTML, and DHTML.
  • Performed performance tuning for Spark Streaming, e.g., setting the right batch interval, the correct level of parallelism, appropriate serialization, and memory tuning.
  • Tuned the performance of SSRS reports and deployed them to the production environment.
  • Expertise in the design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages.
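
As a hedged illustration of the Spark Streaming work above, here is a minimal sketch of a Java job that consumes from Kafka and persists micro-batches to HDFS, assuming Spark 2.x with the spark-streaming-kafka-0-10 integration. The broker address, topic name, group id, batch interval, and HDFS path are all illustrative assumptions.

```java
// Minimal sketch: Spark Streaming from Kafka to HDFS (Spark 2.x,
// spark-streaming-kafka-0-10). Broker, topic, group id, and paths
// below are assumptions, not values from the actual project.
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class KafkaToHdfs {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("kafka-to-hdfs");
        // The batch interval is one of the tuning knobs mentioned above.
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(10));

        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "broker1:9092");   // assumption
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "etl-consumer");            // assumption
        kafkaParams.put("auto.offset.reset", "latest");

        JavaInputDStream<ConsumerRecord<String, String>> stream =
                KafkaUtils.createDirectStream(
                        jssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(
                                Arrays.asList("events"), kafkaParams));

        // Keep only the message payload and persist each micro-batch to HDFS.
        JavaDStream<String> values = stream.map(ConsumerRecord::value);
        values.dstream().saveAsTextFiles("hdfs:///data/landing/events", "txt");

        jssc.start();
        jssc.awaitTermination();
    }
}
```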

Environment: MicroStrategy, DB2, Big Data, Hadoop, Sqoop, Flume, Scala, Spark, MapReduce, AWS, UNIX, SQL Server 2008, Cassandra, DevOps, Chef, Puppet, Ansible, MDM, Agile, ERWIN, Netezza, Visio, Autosys, Python, Datastage 8.5/8.1, MongoDB, SQL Assistant, Teradata 12, MySQL, SSAS, SSIS, SSRS, BIDS, Tableau.

Confidential, TX

Lead Database Admin/Big Data Architect

Responsibilities:

  • Reviewed all LAPS and MAPS capabilities and provided a consolidated solution to build AMPS.
  • Expertise in building Operational Data Store (ODS), Data Marts, and Decision Support Systems (DSS) using Multidimensional Model (Kimball and Inmon), Star and Snowflake schema design.
  • Designed and developed ETL processes using DataStage designer to load data from Oracle, T-SQL, Flat Files (Fixed Width) and XML files to staging database and from staging to the target Data Warehouse database.
  • Implemented business logic by writing Pig UDFs in Java (see the UDF sketch after this list) and used various UDFs from Piggybank and other sources.
  • Developed multiple MapReduce jobs in Java for data cleaning.
  • Worked on and designed a big data analytics platform for processing customer interface preferences and comments using Java, Hadoop, Hive, and Pig on Cloudera.
  • Used the Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs, such as Java MapReduce, Hive, Pig, and Sqoop.
  • Implemented frameworks in Java and Python to automate the ingestion flow.
  • Experienced with accessing Hive tables from Java applications over JDBC to perform analytics.
  • Used Spark Streaming with Kafka & Hadoop Distributed File System (HDFS) & MongoDB to build a continuous ETL pipeline for real time data analytics.
  • Created the Conceptual Model for the Data Warehouse using Erwin Data Modeling tool.
  • Configuration and administration of fiber adapters and handling of SAN disks on AIX.
  • Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Led DevOps development of PowerShell command scripts, custom PowerShell C# cmdlets, Python, Puppet, and Chef to configure, verify, and monitor networks, security, routes, IP forwarding, VMs, and cloud service roles.
  • Imported unstructured data into Hadoop Distributed File System (HDFS) with Spark Streaming & Kafka.
  • Migrated DB2 and mainframe data to Hadoop, taking advantage of the HDFS, Hive, and MapReduce components to process huge legacy batch workloads; batch jobs were taken off the mainframe system using Pig, Hive, and MapReduce.
  • Extensively used metadata and data dictionary management, data profiling, and data mapping.
  • Provided architectural artifacts: a context view, a data-flow view, and a data-architecture view.
  • Documented all current-state and target-state environments and provided a blueprint.
  • Set up meetings with business, technology, infrastructure, and database teams and gave demos of our solution.
  • Suggested the latest upgrades and technologies for NoSQL databases.
  • Worked on data mining to seek trends within the data for later analysis, providing new insights independent of preconceptions.
  • Configured and installed the Informatica MDM Hub Server, Cleanse Server, resource kit, and Address Doctor.
  • Involved in a data migration process between different versions of Cognos.
  • Created complex custom groups using hierarchy to overcome the restriction of MicroStrategy where total on columns and average on rows cannot be calculated at the same time.
  • Designed and developed the ETL process for migrating the LAMPS database to the new AMPS database using SSIS and Informatica.
  • Built a new SSRS reporting tool for business users to help customers.
  • Created Datastage jobs using different stages such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
  • Worked with Informatica team members to design, document, and configure the Informatica MDM Hub to support loading, cleansing, matching, merging, and publication of MDM data.
  • Efficiently incorporated various data sources, such as Oracle, SQL Server (T-SQL), DB2, Sybase, XML, and flat files, into the staging area.
  • Worked on a Java transformation that reads a Hive table and converts the rows into an XML file for CCP.
  • Worked on the governance process and attended AAC meetings for approvals.
  • Worked with the development team, providing technical specifications.
  • Attended agile calls, answered questions, and provided solutions for performance issues in current-state applications.
  • Implemented Kafka Java producers, created custom partitioners, configured brokers, and implemented high-level consumers for the data platform (see the producer sketch after this list).
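
As a hedged illustration of the Pig UDF work above, here is a minimal sketch of a Java EvalFunc; the class name and the normalization logic are illustrative assumptions rather than project details.

```java
// Minimal sketch of a Pig EvalFunc UDF in Java, of the kind described above.
// The class name and normalization rule are illustrative assumptions.
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class NormalizeName extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;                       // Pig treats null as missing
        }
        // Trim and upper-case the field so downstream joins match reliably.
        return input.get(0).toString().trim().toUpperCase();
    }
}
```

In a Pig script, a UDF like this would be registered with REGISTER and then invoked like a built-in function inside a FOREACH ... GENERATE statement.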
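
Likewise, here is a minimal, hedged sketch of a Kafka Java producer with a custom partitioner, as the last bullet describes. The broker address, topic, and key-hash partitioning rule are illustrative assumptions.

```java
// Minimal sketch of a Kafka Java producer with a custom partitioner.
// Broker, topic, and the key-hash rule below are assumptions.
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.Cluster;

public class RegionProducer {

    // Routes records with the same key to the same partition via a key hash.
    public static class KeyHashPartitioner implements Partitioner {
        @Override
        public int partition(String topic, Object key, byte[] keyBytes,
                             Object value, byte[] valueBytes, Cluster cluster) {
            int partitions = cluster.partitionsForTopic(topic).size();
            if (keyBytes == null) {
                return 0;                      // fall back for keyless records
            }
            return Math.abs(key.hashCode() % partitions);
        }

        @Override public void close() {}
        @Override public void configure(Map<String, ?> configs) {}
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker1:9092");          // assumption
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("partitioner.class", KeyHashPartitioner.class.getName());

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "region-7", "payload"));
        }
    }
}
```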

Environment: VS 2005, 2008 and 2012, MicroStrategy, Big Data, Hadoop ecosystem, AWS, Azure SQL, T-SQL, UNIX, SQL Server 2008, Cassandra, Netezza, DevOps, Chef, Puppet, Ansible, MongoDB, MDM, Agile, Python, Datastage 8.5/8.1, Teradata 12, MySQL, SSAS, SSIS, SSRS, BIDS, Crystal Reports.

Confidential

Big Data Architect

Responsibilities:

  • Gathered business requirements and analyzed, designed, and developed this FRD (First Data) project.
  • Developed a testing environment for the FDR data migration; researched all the database objects for this effort, created snapshots from the source, and tested against a local schema.
  • Converted all the SSIS packages into stored procedures as per business need.
  • Extensively used Cognos Framework Manager, Query Studio and Report Studio for model design, ad-hoc reporting and advanced report authoring.
  • Designed and developed SAS reports for financial applications.
  • Worked on designing, developing, documenting, and testing ETL jobs and mappings in server and parallel jobs using DataStage to populate tables in the data warehouse and data marts.
  • Prepared testing scripts for the FDR data from source to target; worked with other team members and coordinated the development.
  • Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs out of the box (such as MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs (such as Java programs and shell scripts).
  • Migrated a large volume of legacy mainframe data to Hadoop; the Sqoop and Flume components of Hadoop helped move data between Hadoop and RDBMS systems.
  • Facilitated JAD sessions to conduct logical data modeling (LDM) and physical data modeling (PDM) reviews with data SMEs.
  • Created AWS EC2 instances (Linux and Windows), RDS instances, and Elastic Load Balancers (ELB) using the AWS SDK (see the sketch after this list).
  • Built a continuous Spark Streaming ETL pipeline with Spark, Kafka, Scala, and the Hadoop Distributed File System (HDFS).
  • Developed Tableau visualizations and dashboards using Tableau Desktop and published them to Tableau Server.
  • Installed and configured the Apache Hadoop, Hive, and Pig environment on the prototype server; configured a SQL database to store the Hive metadata and loaded unstructured data into HDFS. Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into HDFS and Pig to pre-process the data.
  • Fixed issues in SAS analytical and advanced-analytics reports and tuned the SQL to run faster.
  • Used OLAP, OLTP, and star schemas to design data marts and the data warehouse.
  • Worked on SQL Server administration tasks covering backup strategies, database maintenance plans, user (NT/SQL) security authorizations, database creation, tables, and indexes.
  • Provided SQL Server database-related technical insight on this data warehouse migration project.
  • Analyzed differences between old and new system change requirements, created logical and physical database designs based on the required changes, created/updated technical design documents and the production-support operations and maintenance guide, and was responsible for estimating the project's work effort.
  • Produced the overall solution design, including the end-to-end process flow and Web Services definitions/message formats/security; provided ER diagrams and logical data models using Visio.
  • Migrated some Cognos reports to Power BI reports and integrated them for iPad and tablet use.
  • Identified and optimized SQL statements in stored procedures and reports that were pain points, yielding a 35% improvement in the critical-path stored procedures and a 53% improvement in the reports.
  • Reviewed and optimized SSIS packages, SSRS reports, and the .NET application; provided a document listing the improvements and implemented them successfully in production.
  • Tuned the performance of SSRS reports and deployed them to the production environment.
  • Built cubes using SSAS and integrated them with SharePoint so users can apply filters to generate reports.
  • Designed and developed a huge data warehouse using Oracle with a RAC solution.
  • Used PowerShell scripts and commands for SharePoint site administration.
  • Worked on Teradata scripts: fixed issues, updated scripts, and ran queries.
  • Managed and monitored integration database and system performance, including optimizing and tuning data loads and extractions and recommending appropriate changes to ensure and improve system performance and reliability.
  • Developed merge jobs in Python to extract and load data into a MySQL database.
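
To illustrate the AWS SDK work referenced above, here is a minimal sketch of launching an EC2 instance with the AWS SDK for Java (v1). The AMI id, instance type, and key-pair name are illustrative assumptions.

```java
// Minimal sketch of launching an EC2 instance with the AWS SDK for Java v1.
// AMI id, instance type, and key name are assumptions, not project values.
import com.amazonaws.services.ec2.AmazonEC2;
import com.amazonaws.services.ec2.AmazonEC2ClientBuilder;
import com.amazonaws.services.ec2.model.RunInstancesRequest;
import com.amazonaws.services.ec2.model.RunInstancesResult;

public class LaunchInstance {
    public static void main(String[] args) {
        // Credentials and region come from the default provider chain.
        AmazonEC2 ec2 = AmazonEC2ClientBuilder.defaultClient();

        RunInstancesRequest request = new RunInstancesRequest()
                .withImageId("ami-0123456789abcdef0")    // assumption
                .withInstanceType("t2.micro")             // assumption
                .withKeyName("my-keypair")                // assumption
                .withMinCount(1)
                .withMaxCount(1);

        RunInstancesResult result = ec2.runInstances(request);
        String instanceId =
                result.getReservation().getInstances().get(0).getInstanceId();
        System.out.println("Launched instance: " + instanceId);
    }
}
```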

Environment: MicroStrategy, Python, Tableau, SAS, Netezza, Big Data, T-SQL, Hadoop, DB2, Azure SQL, MongoDB, Microsoft SQL Server, Cognos, Oracle 11i, Power BI, IBM Infosphere Datastage 8.5/8.1, Teradata 12, MySQL, QlikView, MDM, PDW, SSAS, SSIS, SSRS, BIDS, Crystal Reports, Business Objects XI, Report Portal 3.0.

Confidential

Lead Solution Architect / DBA, CTI Business Intelligence

Responsibilities:

  • Gathered business requirements and analyzed, designed, and developed this large project.
  • Provided SQL Server database-related technical insight on this data warehouse migration project.
  • Analyzed differences between old and new system change requirements, created logical and physical database designs based on the required changes, created/updated technical design documents and the production-support operations and maintenance guide, and was responsible for estimating the project's work effort.
  • Produced the overall solution design, including the end-to-end process flow and Web Services definitions/message formats/security; provided ER diagrams and logical data models using Visio.
  • Provided solutions for logical data diagrams and strategies to move data from source to target, designing the ETL process with workflows, data flows, and quality transforms, filtering the data per requirements, and loading the data into the target.
  • Involved in discussions with clients/users to determine the dimensions, hierarchies and levels, basic measures, and key metrics for the relational and multidimensional data models, and to understand report requirements; designed star/snowflake-schema data models.
  • Developed database triggers and stored procedures using T-SQL cursors and tables.
  • Provided architecture artifacts for big data solutions and their integration with other systems; presented a blueprint of the current-state and target-state solutions.
  • Provided a solution to load data from different sources to the target staging server; designed the data model using Visio, provided logical and physical database diagrams, and loaded data using SSIS and DataStage ETL processes.
  • Responsible for implementation and ongoing administration of Hadoop infrastructure.
  • Deployed new hardware and software environments required for Hadoop and expanded memory and disks on nodes in existing environments.
  • Converted huge Cognos reports to Business Objects and deployed them to production.
  • Created Hive tables and loaded retail transactional data from Teradata using Sqoop.
  • Installed UNIX and set up ODBC DSNs; installed the third-party Simba Hive driver and created a remote data source to Hadoop to get data in real time for reporting (see the Hive query sketch after this list).
  • Designed cubes using dimensional star schemas and published them on SharePoint.
  • Led onshore and offshore teams in India, providing all the functional requirements and technical design documents and coordinating the development.
  • Generated complex reports in Cognos 10.1 Report Studio, including drill-down reports from a DMR-modeled Framework Manager model.
  • Analyzed the Hadoop cluster using big data analytic tools, including Kafka, Pig, Hive, Spark, and Hadoop.
  • Wrote Python scripts to schedule jobs and data migration.
  • Integrated Business Objects dashboards with SharePoint and imported all the report data into our reporting database.
  • Tuned some of the longer-running database processes, in some cases yielding a greater than 90% performance improvement.
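
The remote Hive data source above is described as an ODBC/Simba setup; as a hedged, analogous illustration from the Java side, here is a minimal sketch of querying HiveServer2 over JDBC. The host, database, table, and query are illustrative assumptions.

```java
// Minimal sketch of querying Hive from Java over JDBC (HiveServer2).
// Host, database, credentials, and query are assumptions.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveReportQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://hadoop-edge:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT store_id, SUM(amount) FROM sales GROUP BY store_id")) {
            while (rs.next()) {
                System.out.println(rs.getString(1) + "\t" + rs.getDouble(2));
            }
        }
    }
}
```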

Environment: VS 2005, 2008, SAS, Cognos 8 & 10, Python, T-SQL, Hadoop, Microsoft SQL Server, Cassandra, entitlement modeling, Power BI, UNIX, IBM Infosphere Datastage 8.5/8.1, Teradata 12, MySQL, QlikView, SSAS, SSIS, SSRS, BIDS, Crystal Reports, Business Objects XI, Report Portal 4.0.

Confidential

Lead DBA / BI Solution Architect

Responsibilities:

  • Understood the multiple systems within a short ramp-up time.
  • Reviewed BRD documents and provided suggestions and feedback.
  • Worked on Solution Architecture document for Confidential Digital Download Migration from LTG Infrastructure.
  • All the systems were developed in .NET, ASP.NET, AJAX, Web service, BizTalk, Oracle, SAS and MS SQL Server.
  • Traveled to provide solutions and demos of the system processes and solutions.
  • Migrated a large amount of legacy data from DB2 and the mainframe to Oracle; provided the roadmap and technology solution to the team and led the huge migration effort.
  • Transferred technical knowledge to two administrative personnel by providing onsite training, detailed best-practice documentation, architecture, and procedures for DB2 systems.
  • Hands-on experience installing, configuring, managing, upgrading, migrating, monitoring, and troubleshooting SQL Server 2000/2005/2008 R2 and 2012.
  • Administered Salesforce Cloud based application for Sales, Marketing, and Support departments. Involved in creating multiple analytical reports, with varying degrees of complexity.
  • Designed and developed functional specifications, technical specifications for BOBJ reporting on top of SAP HANA Models.
  • Designed, developed, implemented, and supported the solution, including Dynamics AX implementation, .NET development, customization, feature enhancements, and production support of sales and distribution reports for business users.
  • Expertise in SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) with good knowledge on SQL Server Analysis Services (SSAS).
  • Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of reports like drill down, Parameterized, Cascading, Conditional, Table, Matrix, Chart and Sub Reports.
  • Updated and wrote Python scripts for jobs, data backup, and recovery.
  • Created multidimensional star schemas; defined fact tables and dimensions.
  • Worked on Governance process and administration.
  • Experience in developing, monitoring, extracting and transforming data using DTS/SSIS, Import Export Wizard, and Bulk Insert.
  • Excellent at high-level design of ETL SSIS packages for integrating data over OLE DB connections from heterogeneous sources (Excel, CSV, Oracle, flat-file, and text-format data) using the transformations provided by SSIS, such as Data Conversion, Conditional Split, Bulk Insert, Merge, and Union All.
  • Developed reports with SSRS on SQL Server (2000/2005/2008); sound experience with and understanding of SSAS, OLAP cubes, and their architecture.
  • Experienced in writing SSRS expressions and expert at fine-tuning reports; created many drill-through and drill-down reports using SSRS.
  • Reviewed the current state of the databases on LTG, provided a solution for migrating this data into DF and Product Group, and mapped the current-state database schema to the target-state schema for the data migration.
  • Set up a new WCF service that sends data through the Oracle enterprise service bus and returns the response to the client.
  • Led the team to use libcurl and Apache to communicate via a scalable architecture with public cloud storage supporting Amazon S3, Microsoft Azure Cloud Services, and Salesforce cloud services.
  • Provided solutions for communicating among multiple systems with Web services and WCF services.
  • Gave demos of the current-state and target-state systems and provided PAD and SAD documents to the technical team.

Environment: VS 2005, 2008 and 2012. Python, Cloud, Salesforce, Cognos, Microsoft SQL Server 2008 and DB2, UNIX, SAP HANA 1.0, Business Objects, MySQL, AIX, Azure, SAS, SSAS, SSIS, SSRS, BIDS, CVS, VSS, Crystal Reports, Business Objects XI, Report Portal 3.0.

Confidential, Dallas, TX

Database developer

Responsibilities:

  • Designed the entire system for logging incoming call information into the database.
  • Analyzed business requirements and designed/architected new IVR/speech solutions.
  • Involved as the primary on-site ETL developer during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Service, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
  • Created business, application, database, and infrastructure diagrams using Visio.
  • Installed SQL Server 2008 Standard Edition and enabled SSIS/SSRS/SSAS on the analytical development server.
  • Comprehensive Cognos admin review: created multiple models to meet the need for ad hoc and canned reports; modified the existing Phase 1 model and covered all business needs for Phase 2.
  • Developed multiple Webi reports for predictive metric availability using BI Launchpad; created universes in IDT for these new reports using Design Studio and wrote scripts to customize some special columns.
  • Responsible for upgrades, web services, cloud services with Salesforce, databases, and system configuration for the infrastructure group of Quantum's storage division.
  • Identified and prioritized a Cloud service feature roadmap including object versioning, conditional puts, offsite replication, and performance.
  • Used advanced T-SQL features to design and tune T-SQL that interfaces with the database and other applications in the most efficient manner, and created stored procedures for the business logic (see the sketch after this list).
  • Led about 25 developers across onshore and offshore teams in India and the USA, provided design solutions, and tracked the development; followed the life-cycle process of deploying from Dev to Prod and communicated with both business and management.
  • Set up the CIT and UAT database environments; backed up and restored the databases in these environments and set up database security.
  • Worked closely with developers, supported them on database issues, and provided information about database access; provided solutions for RAC and RAT.
  • Analyzed the requirements and worked on data modeling based on the business requirements; designed all master and child tables and their relations using Visio and created the physical database tables per the Visio diagram.
  • Performed end-to-end HANA POC technical activities, including loading data into HANA tables from flat files using Data Services, modeling in HANA (analytic and calculation views), creating universes on top of the HANA views in IDT, and building reports in WebI and dashboards in Dashboard Designer.
  • Advanced scripting and ETL, data modeling (star schema); worked with large data sets.
  • Used complex QlikView functions and developed optimal scripts for each solution.
  • Good experience with SQL, relational databases, dimensional modeling, and optimizing data models for performance.
  • Backed up and restored data to new tables and imported data from flat files into new tables.
  • Set up the SSRS and SSIS environments, created many different types of reports, deployed them on CIT and UAT, and showed the business how to use the reports.
  • Set up nightly jobs for data replication from Production to UAT and CIT.
  • Tuned and troubleshot database issues and supported the team 24/7.
  • Created many complex SSIS packages for importing data from other sources into our databases, set up email notification upon successful execution of each job, and troubleshot data issues to support the business.
  • Designed and developed SSRS reports using Visual Studio; wrote complex queries, stored procedures, and functions to fetch data from the database.
  • Developed and supported SSRS report types such as chart reports, master-detail, drill-down, and drill-through reports.
  • Designed and developed functional specifications, technical specifications for BOBJ reporting on top of SAP HANA Models.
  • Designed the database by analyzing the data requirements and created the entire relational database for this project.
  • Designed and developed reports in Cognos; tuned the reports and queries for better performance.
  • Worked with tabular reports, matrix reports, gauges & chart reports, parameterized reports, sub reports, ad-hoc reports, drill down reports as well as interactive reports according to business requirements in time restricted environment using Crystal Reports and SQL server reporting services (SSRS).
  • Supported and worked on performance tuning for Oracle data warehouse.
  • Created ad hoc reports as requested and conducted ad hoc research using Crystal Reports XI.
  • Designed, developed, and analyzed Crystal reports from the Business Objects universe.
  • Migrated data from old environments to LLE; handled backup and restore, set up security for different groups, and supported the move from MS SQL Server 2005 to 2008.
  • Wrote SSIS packages to push data from production to LLE every night.
  • Worked on data purging, cleaning up data older than 13 months; archived and deleted data using SSIS packages.
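
To illustrate the stored-procedure work described earlier in this list, here is a minimal, hedged sketch of calling a T-SQL stored procedure from Java via JDBC. The connection string, procedure name, parameter, and result columns are hypothetical.

```java
// Minimal sketch of invoking a T-SQL stored procedure from Java via JDBC.
// Connection string, procedure name, and columns are assumptions.
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CallLogReport {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://dbserver:1433;databaseName=CallLog"; // assumption
        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             CallableStatement cs = conn.prepareCall("{call dbo.usp_GetCallsByDay(?)}")) {
            cs.setString(1, "2009-06-01");          // hypothetical parameter
            try (ResultSet rs = cs.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("caller") + " " + rs.getInt("calls"));
                }
            }
        }
    }
}
```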

Environment: VS 2005, 2008 and 2012, .NET Framework 2.0 and 3.5, C#.NET, Business Objects, Cognos, Microsoft SQL Server 2008, Python, Datastage, Oracle 10g, RAC, RAT, Teradata, T-SQL, PL/SQL, Perl, SSAS, SSIS, SSRS, BIDS, Crystal Reports, Business Objects XI, Internet Information Server (IIS), CVS, MS Excel 2007, Report Portal 3.0.

Confidential 

Lead Database Architect/DBA

Responsibilities:

 
  • Worked with the project lead, analysts, and business users to understand business processes, document project requirements, and translate them into functional and non-functional specifications for BI reports and applications.
  • Designed the entire database per the business requirements using Visio; provided a demo of the data model and built the new database according to the Visio design.
  • Developed SSRS reports such as chart reports, master-detail, drill-down, and drill-through reports.
  • Worked on the migration from one data center to another; migrated a huge ActiveReports project to Crystal Reports.
  • Used Business Objects Crystal Xcelsius and Query as a Web Service (QAAWS) for data access.
  • Integrated the Remedy system with Crystal Reports and automated the reporting system so that users can enter parameters and generate the reports themselves.
  • Designed and developed complicated reports using Crystal Reports.
  • Formulated and documented the physical ETL process design based on business requirements and system specifications with strong ETL design skills, including source to target mappings, transformations, lookups, aggregations, expressions.
  • Involved in creating specifications for ETL processes, finalized requirements and prepared specification document.
  • Used DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into the data marts, performing actions in XML.
  • Used Business Objects Data Services to extract data from various sources including flat files of fixed format, Excel, XML, CSVs and databases like Oracle 10g/11g, MySQL and SQL Server.
  • Involved in deploying SSIS packages for various environments.
  • Created packages in SSIS with error handling as well as created complex SSIS packages using various data transformations like conditional split, fuzzy look up, for each loop, multi cast, column conversion, fuzzy grouping, and script components.
  • Designed and developed reports with running values, interactive sorting, and conditional formatting.
  • Provided solutions for design, technical, and infrastructure concerns and for integration with other external systems such as Cognos.
  • Created tabular reports, matrix reports, gauges and chart reports, parameterized reports, sub-reports, ad hoc reports, drill-down reports, and interactive reports according to business requirements in a time-restricted environment using SQL Server Reporting Services (SSRS) and Crystal Reports.

Environment: VS 2005, 2008, .NET Framework, C#.NET, Business Objects, T-SQL, Cognos, Crystal Reports, Microsoft SQL Server 2008/2005, Python, Navision, SSAS, SSIS, SSRS, BIDS, Data Stage 7.5.1 Enterprise Edition, Oracle 10g, Remedy 7.5, IIS, DTS, Data Modeling, ETL, MS Visio, UML, Web Services.
