Digital Business Integration Architect/Consultant Resume
SUMMARY
- More than six years of professional experience in the design and development of Big Data tools, cloud computing, Amazon Web Services, and data analytics, with hands-on work on components such as Spark, Oozie, Flume, Sqoop, Kafka, and Hive
- Well versed in the Big Data ecosystem and the tools necessary to design and build a solution from scratch
- Experienced in setting up and administering Big Data clusters of various distributions such as Cloudera and Hortonworks
- Worked as a team lead to analyze problems, devise solutions, build working models, and present them to clients
- Extensively researched cloud computing technologies and worked with providers like Microsoft and Amazon to deploy Big Data tools on the cloud
- Worked on real-time processing systems using technologies such as Spark and Storm, ingesting data through messaging queues such as Kafka and RabbitMQ and analyzing it with Hive and R
- Managed and led teams designing and implementing legacy modernization projects and other Big Data solutions while ensuring that all regulations were followed and all documentation was completed
- Successfully modified architectures on the basis of usage and performance to reduce cost by 40 percent while maintaining performance
TECHNICAL SKILLS
- Requirement gathering, elicitation skills, gap analysis, use case scenarios, SWOT analysis, high- and low-level business documentation, UML-based documentation, consulting, project management
- QlikView, D3.js, Tableau, Datameer
- Hadoop, Spark, Sqoop, Storm, Kafka, Hive, Hazelcast, R, Mahout
- WorldCheck, LexisNexis
- Balsamiq, HTML, Adobe Photoshop CS5, Adobe InDesign CS5
- Java, Python, MySQL, Hive, HBase, MongoDB, Git, IBM RAD, Eclipse, Amazon Web Services, Microsoft Azure, RabbitMQ
- Linux (Ubuntu, Red Hat, CentOS), Windows
- Microsoft Visio, Microsoft Project
- GCP (Google Cloud Platform), Microsoft Azure, Amazon Web Services
PROFESSIONAL EXPERIENCE
Digital Business Integration Architect/Consultant
Confidential
Responsibilities:
- Worked with a financial and insurance client on architectural design and implementation of the IFRS 17 (International Financial Reporting Standard 17) framework, which is targeted to revise how the company's reporting and predictions are made by 2020
- Created an infrastructure-based architecture for implementation of policy valuation projects for the China business unit, keeping in mind regional data policies and the budgets involved
- Helped convert a hybrid (cloud and on-premise) architecture to a fully cloud-based one, with budgeting considered so that costs would not rise drastically; benchmarked 21Vianet (Azure distributor in China) to make usage-based suggestions, and the architecture was approved by the business without any changes
- Deployed all Big Data solutions on the cloud (specifically Amazon Web Services)
- Negotiated directly with vendors to ensure proper use of Informatica (ICS) licenses while evaluating current usage to eliminate redundancies and ensure optimal utilization
- Currently working on architecture and implementation of systems related to HR (payroll and timesheets), finance (project-related financing, business division financing), and reporting to ensure better use of the enterprise Hadoop data lake, Spark, and Hive (see the sketch after this list)
- Work closely with developers and the security team to ensure data security and data integrity are maintained, and establish best practices and ethics for the same
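Below is a minimal sketch of the kind of reporting job this data-lake work implies, using Spark with Hive support; the database, table, and column names are hypothetical placeholders rather than the client's actual schema.

```python
# Illustrative sketch: reporting over a Hive table in a Hadoop data lake via Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# enableHiveSupport lets Spark read tables registered in the Hive metastore.
spark = (
    SparkSession.builder
    .appName("hr-finance-reporting-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical HR timesheet table; not the client's real schema.
timesheets = spark.table("hr.timesheets")

# Example report: total hours logged per employee per month.
report = (
    timesheets
    .groupBy("employee_id", F.date_format("work_date", "yyyy-MM").alias("month"))
    .agg(F.sum("hours").alias("total_hours"))
)
report.show()
```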
Data Engineering/Manager Data Analytics
Confidential
Responsibilities:
- Formulated solution architectures for upcoming Big Data projects and optimized them to deliver the best results in the least time
- Built proofs of concept for Big Data use cases on components such as Spark, Oozie, Flume, Sqoop, Kafka, and Hive
- Provided suitable solutions to problems the bank was facing in ongoing projects implementing Big Data frameworks, data science, and Java applications
- Helped run data analytics algorithms such as FP-Growth and association rule mining to generate recommendations and cross-sell products on the basis of trends (see the sketch after this list)
- Made major contributions to projects where data had to be ingested in both real-time and batch modes, providing suggestions on components such as Spark, Oozie, Flume, Sqoop, Kafka, and Hive
- Created architectures for data ingestion into a Hadoop-based data lake while making sure all protocols were followed and no data regulations were violated; worked closely with the data security team to ensure data integrity was maintained
- Performed server assessment, resource estimation, and time estimation before initiating projects and presented the same to the Solutions Review Committee for approval
- Modified the architecture to reduce project cost by 40 percent while making sure end users were not impacted
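As a rough illustration of the FP-Growth and association-rule recommendation work mentioned above, the sketch below uses Spark MLlib; the basket data, column names, and thresholds are assumptions for demonstration, not the bank's actual configuration.

```python
# Illustrative FP-Growth / association-rule sketch using Spark MLlib.
from pyspark.sql import SparkSession
from pyspark.ml.fpm import FPGrowth

spark = SparkSession.builder.appName("cross-sell-sketch").getOrCreate()

# Hypothetical transaction data: one row per customer basket of products held.
baskets = spark.createDataFrame(
    [(0, ["savings", "credit_card"]),
     (1, ["savings", "credit_card", "home_loan"]),
     (2, ["savings", "home_loan"])],
    ["id", "items"],
)

# minSupport / minConfidence are assumed thresholds; they would be tuned on real trends.
fp = FPGrowth(itemsCol="items", minSupport=0.5, minConfidence=0.6)
model = fp.fit(baskets)

model.freqItemsets.show()        # frequent item sets discovered by FP-Growth
model.associationRules.show()    # rules that drive cross-sell suggestions
model.transform(baskets).show()  # per-customer recommendations
```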
Data Engineering/Technical Specialist co-op
Confidential
Responsibilities:
- Evaluated system potential by testing compatibility of new programs with existing programs
- Worked on expansions and enhancements by studying the workload and capacity of computer systems for Java applications
- Evaluated vendor-supplied software by studying user objectives and testing software compatibility with existing hardware and programs
- Placed software and hardware into production by loading Java application software, making the necessary hardware connections, and entering the required commands
- Maximized use of hardware and software by training users, interpreting instructions, and answering questions
- Created catalogues and documentation to reduce the learning curve for users who want to try out solutions themselves
Senior Software Engineer
Confidential
Responsibilities:
- Prepared PoCs (proofs of concept) for clients including Honda North America, Citibank Singapore, Barclays South Africa, and other in-house clients
- Worked with the team whose offering of telematics-based analysis of vehicular data and Java applications was awarded best analytics solution; primarily worked on design and implementation for the same
- Designed and implemented an end-to-end solution for Java applications that ingested data using Kafka and Flume, stored it in Hive, and processed it in Storm or Spark depending on client need (see the sketch after this list)
- Managed and mentored a team of engineers to design and deploy a PoC for a common Big Data framework
- Analyzed customer requirements to determine product feasibility
- Worked on a FinCrime (Financial Crime Analytics) tool for Barclays South Africa; created an architecture on the basis of the existing tools the client was using
- Showcased leadership by guiding the team to deploy projects onsite in the US while making sure their work-life balance was not impacted
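A minimal sketch of the Kafka-to-Spark ingestion path described above, using Spark Structured Streaming; the broker address, topic name, and console sink are illustrative assumptions standing in for the actual client setup.

```python
# Illustrative Kafka -> Spark Structured Streaming ingestion sketch.
# Requires the spark-sql-kafka-0-10 package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Broker address and topic are hypothetical placeholders.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "vehicle-events")
    .load()
)

# Kafka delivers key/value as binary; cast them to strings before processing.
parsed = events.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("value"),
)

# A console sink stands in here for the Hive / downstream processing layer.
query = (
    parsed.writeStream
    .outputMode("append")
    .format("console")
    .start()
)
query.awaitTermination()
```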