Jul 20, 2016 · Tech Talk

Importance of Big Data and Managing Data with Elasticsearch

Article by Nipuna


What Is Big Data?

Data management for organizations of all sizes has shifted from an important competency to a critical differentiator that can determine market winners and “has beens”. Fortune 1000 companies and government bodies are starting to benefit from the innovations of the web pioneers.

These organizations are defining new initiatives and re-evaluating their existing strategies to examine how Big Data can transform their business processes. In the process, they are learning that Big Data is not a single technology, technique or initiative, but a trend that can benefit many areas of the business using the newest technology available today.

Many of our day-to-day activities generate Big Data, such as conversion data, sensor data and multimedia data. Big Data refers to technologies and initiatives that involve data too diverse, fast-changing or massive for conventional technologies, skills and infrastructure to address efficiently.

Specifically, Big Data is related to data creation, storage, retrieval and analysis that are remarkable in terms of volume, velocity, and variety:

  • Volume.

A typical PC might have had 10 gigabytes of storage in the year 2000. Today, Facebook ingests 500 terabytes of new data every day, and a Boeing 737 generates 240 terabytes of flight data during a single flight across the US. Meanwhile, the proliferation of smartphones, the data they create and consume, and the sensors embedded in everyday objects will soon result in billions of new, constantly updated data feeds containing environmental, location and other information, including video.

  • Velocity.

Clickstreams and ad impressions capture user behavior at millions of events per second; high-frequency stock trading algorithms reflect market changes within microseconds; machine-to-machine processes exchange data between billions of devices; infrastructure and sensors generate massive log data in real time; online gaming systems support millions of concurrent users, each producing multiple inputs per second.

  • Variety.

Traditional database systems are designed to address smaller volumes of structured data, fewer updates or a predictable, consistent data structure. As applications have evolved to serve large volumes of users and as application development practices have become agile, the traditional use of the relational database has become a liability for many companies, rather than an enabling factor in their business.

What do companies need to consider when getting started with Big Data?

With rapidly changing market requirements and competition, companies are developing their Big Data business cases. The platform and speed discussions are only part of the overall conversation about delivering Big Data; in reality, the following seven steps are necessary for realizing the full potential of Big Data.

  • Collect: Data is collected from the data sources and distributed across multiple nodes, often a grid, each of which processes a subset of data in parallel.


  • Process: The system then uses that same high-powered parallelism to perform fast computations against the data on each node. Next, the nodes reduce the resulting data findings into more consumable data sets to be used by either a human being (in the case of analytics) or a machine (in the case of large-scale interpretation of results). A minimal sketch of this distribute-process-reduce pattern appears after this list.


  • Manage: Often the Big Data being processed is heterogeneous, originating from different transactional systems. Nearly all of that data needs to be understood, defined, annotated, cleansed and audited for security purposes.


  • Measure: Companies will often measure the rate at which data can be integrated with other customer behaviors or records and whether the rate of integration or correction is increasing over time. Business requirements should determine the type of measurement and the ongoing tracking.


  • Consume: The resulting use of the data should fit in with the original requirement for the processing. For instance, if bringing in a few hundred terabytes of social media interactions demonstrates whether and how social media data delivers additional product purchases, then there should be rules for how social media data is accessed and updated. This is equally important for machine-to-machine data access.


  • Store: As the “data-as-a-service” trend takes shape, increasingly the data stays in a single location, while the programs that access it move around. Whether the data is stored for short-term batch processing or longer-term retention, storage solutions should be deliberately addressed.


  • Govern: Data governance covers the policies and oversight of data from a business perspective. As defined, data governance applies to each of the six preceding stages of Big Data delivery. By establishing processes and guiding principles, governance sanctions behaviors around data, and Big Data needs to be governed according to its intended consumption. Otherwise, the risk is disaffection of constituents, not to mention over-investment.

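To make the Collect and Process steps concrete, here is a minimal, illustrative sketch in Python of the distribute-process-reduce pattern. It uses a local process pool as a stand-in for a grid of nodes; a real Big Data deployment would rely on a framework such as Hadoop or Spark rather than a single machine.

from multiprocessing import Pool

def count_words(chunk):
    # Process: compute a partial result against one node's subset of the data.
    counts = {}
    for line in chunk:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

def merge(partials):
    # Reduce: fold the per-node findings into one consumable data set.
    total = {}
    for partial in partials:
        for word, n in partial.items():
            total[word] = total.get(word, 0) + n
    return total

if __name__ == "__main__":
    # Collect: raw data split into subsets, one per "node".
    log_lines = ["error disk full", "info user login", "error timeout"] * 1000
    chunks = [log_lines[i::4] for i in range(4)]

    # Each subset is processed in parallel, then the results are reduced.
    with Pool(processes=4) as pool:
        partial_counts = pool.map(count_words, chunks)
    print(merge(partial_counts)["error"])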

How to make sense of Big Data?

The end goal of handling Big Data is to make sense of it, and the way it is presented matters most. There are different ways to represent Big Data to end users, and a number of commercial and open-source applications are available for this purpose.


 

What is Elasticsearch?

Accessible through an extensive and elaborate API, Elasticsearch can power extremely fast searches that support your data discovery applications. Elasticsearch is a schema-less database that has powerful search capabilities and is easy to scale horizontally.
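Because that API is exposed over plain HTTP, checking that a node is up takes a single request. The snippet below is a minimal sketch that assumes an Elasticsearch node running locally on the default port 9200; the root endpoint returns basic cluster and version information as JSON.

import requests

# Ask the root endpoint of an (assumed) local node for cluster and version info.
response = requests.get("http://localhost:9200/")
info = response.json()
print(info["cluster_name"], info["version"]["number"])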

 

Benefits of using Elasticsearch

Conventional SQL database management systems are not really designed for full-text search, and they certainly do not perform well against loosely structured raw data that resides outside the database. On the same hardware, queries that would take more than 10 seconds using SQL can return results in under 10 milliseconds in Elasticsearch.

During an indexing operation, Elasticsearch converts raw data such as log files or message files into internal documents and stores them in a basic data structure similar to a JSON object. Each document is a simple set of correlating keys and values: the keys are strings, and the values are one of numerous data types—strings, numbers, dates, or lists.

All fields are indexed by default, and all the indices can be used in a single query to easily return complex results at breathtaking speed.

Adding documents to Elasticsearch is easy, and it is easy to automate: simply do an HTTP POST that transmits your document as a simple JSON object. Searches are also done with JSON: send your query in an HTTP GET with a JSON body. It is important to remember that Elasticsearch is not a relational database, so DBMS concepts do not usually apply. The most important concept that you must set aside when coming over from conventional databases is normalization.
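As a minimal sketch of those two operations over plain HTTP, the snippet below indexes a document and then searches it. The index name "blog", the type "post" and all field values are made up for illustration, a node is assumed on localhost:9200, and the "/post" type segment reflects the Elasticsearch versions current when this article was written (newer releases drop mapping types).

import requests

# Index a document: an HTTP POST with the document as a JSON object.
document = {
    "title": "Importance of Big Data",
    "author": "Nipuna",
    "published": "2016-07-20",
    "tags": ["big data", "elasticsearch"],
    "views": 1024,
}
resp = requests.post("http://localhost:9200/blog/post", json=document)
print(resp.json()["_id"])  # Elasticsearch assigns a document id

# Search: the query itself is a JSON body. Elasticsearch accepts the body on
# GET, and also accepts POST to _search for clients that cannot send one.
# (A just-indexed document becomes searchable once the index refreshes,
# typically within about a second.)
query = {"query": {"match": {"title": "big data"}}}
resp = requests.get("http://localhost:9200/blog/_search", json=query)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"]["title"])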

Elasticsearch does not natively permit joins or subqueries, so denormalizing your data is essential. Elasticsearch can scale up to thousands of servers and accommodate petabytes of data; its enormous capacity results directly from its elaborate distributed architecture. Elasticsearch is also API-driven: almost any action can be performed using a simple RESTful API with JSON over HTTP, and client libraries are available for many programming languages.
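To show what setting normalization aside looks like in practice, the sketch below embeds data that a relational schema would keep in separate tables directly in one document, so a single query returns everything without a join. The index, fields and values are hypothetical.

import requests

# Author details and comments are embedded in the post document itself,
# rather than living in separate tables joined at query time.
denormalized_post = {
    "title": "Managing Data with Elasticsearch",
    "author": {"name": "Nipuna", "team": "Web"},
    "comments": [
        {"user": "shamali", "text": "Nice overview"},
        {"user": "sashika", "text": "Very helpful"},
    ],
}
requests.post("http://localhost:9200/blog/post", json=denormalized_post)

# A single search returns the post together with its author and comments.
query = {"query": {"match": {"comments.text": "helpful"}}}
result = requests.get("http://localhost:9200/blog/_search", json=query).json()
print(result["hits"]["total"])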

 

Indexing Big Data with Elasticsearch

This leads to a real challenge: to plan ahead and figure out a way to store and analyze huge amounts of data as efficiently as possible, and to make that data easily searchable for our users by providing them with advanced queries. Elasticsearch is schema-free and document-oriented; complex real-world entities are stored in Elasticsearch as structured JSON documents.

Elasticsearch is document-oriented, meaning that it stores entire objects or "documents". It not only stores them, but also indexes the contents of each document in order to make them searchable. You also have full control to customize how your data is indexed, which simplifies analytics by speeding up data retrieval.
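As an example of customizing how data is indexed, the sketch below creates an index with an explicit mapping: one field is treated as a date, one as an exact (not analyzed) value, and one is analyzed with the English analyzer. The index and field names are made up, a mapping like this would be applied before any documents are indexed, and the syntax follows the Elasticsearch versions current when this article was written; newer releases replace the "string"/"not_analyzed" settings with the "text" and "keyword" field types.

import requests

# Create the "blog" index with an explicit mapping for documents of type "post".
mapping = {
    "mappings": {
        "post": {
            "properties": {
                "published": {"type": "date", "format": "yyyy-MM-dd"},
                "tags": {"type": "string", "index": "not_analyzed"},
                "title": {"type": "string", "analyzer": "english"},
            }
        }
    }
}
resp = requests.put("http://localhost:9200/blog", json=mapping)
print(resp.json())  # {"acknowledged": true} when the index is created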

 


 

Authors: Shamali Kurukulasuriya, Sashika Robertson and Nipuna Gomes

Tags: Big Data, Big Data Analysis, Elasticsearch, Indexing Big Data, Mazarin


About the Author: Nipuna Gomes

Nipuna is an experienced professional and a web and database enthusiast with skills in ASP.NET MVC4, C# and VB.NET, along with database management systems. He has over four years of experience with Microsoft Windows-based desktop, web and data technologies, and currently works as an Associate Software Engineer at Mazarin (Pvt) Ltd.
