News Release on Big Data Research: September-2018

Big data and technology in disasters: Better integration needed for effective response

Disasters are becoming more common and more complex, and the challenges facing rescue and humanitarian organizations are growing accordingly. Increasingly, these teams turn to big data to help provide solutions. Researchers set out to examine how ICT tools and big data were being used in disaster responses. By conducting a structured literature search and developing a data extraction tool on the use of ICT and big data during disasters, they showed that significant gaps exist that should form part of a future research focus. [1]

A scalable and distributed dendritic cell algorithm for big data classification

In the era of big data, scaling evolutionary computation up to large-scale data sets is a very interesting and challenging task. Applying standard bio-inspired systems to such data sets is not straightforward. Therefore, a new class of scalable bio-inspired systems that embraces the large storage and processing capacity of distributed platforms is needed. In this work, we focus on the Dendritic Cell Algorithm (DCA), a bio-inspired classifier, and its limitations when dealing with very large data sets. [2]
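For readers unfamiliar with the classifier, the following is a minimal single-machine sketch of the standard (non-distributed) DCA, not the scalable variant the paper proposes. The signal weights, migration-threshold range, and the `dca` function name are illustrative assumptions; each data item ("antigen") carries PAMP, danger, and safe signals, dendritic cells accumulate them, and an item is classed anomalous when its mature-context presentation rate (MCAV) exceeds a threshold.

```python
import random

# Illustrative signal weights: PAMP and danger push a cell toward the
# "mature" (anomalous) context, safe pushes toward "semi-mature" (normal).
W = {
    "csm":  {"pamp": 2.0, "danger": 1.0, "safe": 2.0},
    "semi": {"pamp": 0.0, "danger": 0.0, "safe": 3.0},
    "mat":  {"pamp": 2.0, "danger": 1.0, "safe": -1.5},
}

def dca(antigens, num_cells=10, threshold=0.5, seed=0):
    """antigens: list of (antigen_id, {'pamp': x, 'danger': y, 'safe': z}).
    Returns {antigen_id: True if classified anomalous (MCAV > threshold)}."""
    rng = random.Random(seed)
    fresh = lambda: {"csm": 0.0, "semi": 0.0, "mat": 0.0,
                     "store": [], "limit": rng.uniform(5, 15)}
    cells = [fresh() for _ in range(num_cells)]
    presented = {}  # antigen_id -> (mature presentations, total presentations)
    for ag_id, sig in antigens:
        cell = rng.choice(cells)          # each item is sampled by one cell
        cell["store"].append(ag_id)
        for out in ("csm", "semi", "mat"):
            cell[out] += sum(W[out][s] * sig[s] for s in ("pamp", "danger", "safe"))
        if cell["csm"] >= cell["limit"]:  # cell "migrates" and presents its items
            context = 1 if cell["mat"] > cell["semi"] else 0
            for a in cell["store"]:
                m, t = presented.get(a, (0, 0))
                presented[a] = (m + context, t + 1)
            cell.update(fresh())          # reset the cell
    # MCAV = mature fraction; above the threshold => anomalous
    return {a: (m / t > threshold) for a, (m, t) in presented.items() if t}
```

Note that items still held by cells that never migrate are simply dropped in this sketch; a production version would flush remaining cells at the end of the stream.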

Big Data-Driven Agricultural Products Supply Chain Management: A Trustworthy Scheduling Optimization Approach

Big data is promoting the development of supply chain design and management. The problem of trustworthy scheduling using big data is challenging, and it significantly influences the performance of agricultural products supply chain (APSC) management. Currently, there are various approaches to optimizing the scheduling of APSCs; however, most of them can only tackle the problem with primary objectives (time and cost) or are restricted to small-scale supply chains. Economic approaches have not been provided for the scheduling of APSCs in big data environments. This paper aims to propose a novel trustworthy scheduling optimization approach for APSCs using big data. First, a new management architecture is provided for revealing underexploited value in big data to support the scheduling of APSCs. [3]
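The paper's own optimization scheme is not reproduced here; as a hedged illustration of what scheduling against the two primary objectives (time and cost) can look like, here is a minimal greedy weighted-sum sketch. The `schedule` function, its carrier model, and all parameter names are illustrative assumptions, not taken from the paper.

```python
def schedule(orders, carriers, w_time=0.5, w_cost=0.5):
    """Greedy weighted-sum scheduler: assign each order to the carrier
    minimising w_time * completion_time + w_cost * shipping_cost.

    orders:   list of (order_id, quantity)
    carriers: dict carrier_id -> {"time": per-unit transit time,
                                  "cost": per-unit shipping cost}
    Returns dict order_id -> chosen carrier_id.
    """
    backlog = {c: 0.0 for c in carriers}       # units already queued per carrier
    plan = {}
    for oid, qty in sorted(orders, key=lambda o: -o[1]):   # big orders first
        best, best_score = None, float("inf")
        for cid, p in carriers.items():
            # completion time includes work already queued on this carrier,
            # which makes the time objective naturally load-balancing
            finish = (backlog[cid] + qty) * p["time"]
            cost = qty * p["cost"]
            score = w_time * finish + w_cost * cost
            if score < best_score:
                best, best_score = cid, score
        plan[oid] = best
        backlog[best] += qty
    return plan
```

Sweeping `w_time`/`w_cost` between 0 and 1 traces out the time-cost trade-off: with `w_cost=1` everything goes to the cheapest carrier, while with `w_time=1` orders spread across carriers as backlogs grow.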

Practical Big Data Analytics: Hands-on techniques to implement enterprise analytics and machine learning using Hadoop, Spark, NoSQL and R

Take command of your organization's big data using the power of data science and analytics. Key features: a perfect companion to boost your big data storing, processing, and analyzing skills and help you make sophisticated business decisions; work with the best tools, such as Apache Hadoop, R, Python, and Spark, together with NoSQL platforms, to perform large-scale online analyses; get expert tips on statistical inference, machine learning, mathematical modeling, and data visualization for big data.

Book description: Big data analytics refers to the strategies used by organizations to collect, organize, and analyze large amounts of data to uncover valuable business insights that cannot otherwise be obtained through traditional systems. Crafting an enterprise-scale, cost-effective big data and machine learning solution to uncover insights and value from your organization's data is a challenge. Today, with many new big data systems, machine learning packages, and BI tools, selecting the right combination of technologies is an even greater challenge. This book will help you do just that. With the help of this guide, you will be able to bridge the gap between the theoretical world of technology and the practical reality of building corporate big data and data science platforms. [4]

Improved FTWeightedHashT Apriori Algorithm for Big Data using Hadoop-MapReduce Model

The most important problem in data mining is frequent itemset mining on massive datasets. The well-known basic algorithm for frequent itemset mining is Apriori. Because of the drawbacks of the Apriori algorithm, many improvements have been made to make Apriori better, more efficient, and faster. We have reviewed over a hundred papers related to this work that include improvements made to enhance the Apriori algorithm. Weight-based Apriori and Hash Tree-based Apriori are the most important improvements. One recent paper integrated the weight concept of Weighted Apriori and the hash-tree construction concept of Hash Tree Apriori to produce a hybrid Apriori algorithm named WeightedHashT. [5]
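As background for the improvements discussed above, here is a minimal sketch of the classic Apriori algorithm itself: join and prune candidate generation followed by support counting. The WeightedHashT hybrid adds item weighting and hash-tree-based counting on top of this base and is not reproduced here.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Classic Apriori frequent-itemset mining.

    transactions: list of sets of items
    min_support:  absolute support count threshold
    Returns {frozenset(itemset): support_count} for all frequent itemsets.
    """
    # L1: frequent single items
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {s: c for s, c in counts.items() if c >= min_support}
    result = dict(frequent)
    k = 2
    while frequent:
        # Join step: union pairs of frequent (k-1)-itemsets that differ by one
        # item; prune step: drop candidates with any infrequent (k-1)-subset.
        prev = list(frequent)
        candidates = set()
        for i in range(len(prev)):
            for j in range(i + 1, len(prev)):
                union = prev[i] | prev[j]
                if len(union) == k and all(
                        frozenset(sub) in frequent
                        for sub in combinations(union, k - 1)):
                    candidates.add(union)
        # Support counting (a full scan; this is the step hash trees speed up)
        counts = {c: 0 for c in candidates}
        for t in transactions:
            for c in candidates:
                if c <= t:
                    counts[c] += 1
        frequent = {s: c for s, c in counts.items() if c >= min_support}
        result.update(frequent)
        k += 1
    return result
```

The anti-monotone property (every subset of a frequent itemset is frequent) is what makes the prune step sound; the repeated full scans in the counting step are exactly why MapReduce-based variants like the one in this paper distribute that work.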

Reference

[1] Big data and technology in disasters: Better integration needed for effective response

Date: August 22, 2018, Source: Society for Disaster Medicine and Public Health, Inc. (web link)

[2] A scalable and distributed dendritic cell algorithm for big data classification

Dagda ZC. A scalable and distributed dendritic cell algorithm for big data classification. Swarm and Evolutionary Computation. 2018 Sep 1. (web link)

[3] Big Data-Driven Agricultural Products Supply Chain Management: A Trustworthy Scheduling Optimization Approach

Tao Q, Gu C, Wang Z, Rocchio J, Hu W, Yu X. Big Data-Driven Agricultural Products Supply Chain Management: A Trustworthy Scheduling Optimization Approach. IEEE Access. 2018 Aug 30. (web link)

[4] Practical Big Data Analytics: Hands-on techniques to implement enterprise analytics and machine learning using Hadoop, Spark, NoSQL and R

Dasgupta N. Practical Big Data Analytics: Hands-on techniques to implement enterprise analytics and machine learning using Hadoop, Spark, NoSQL and R. (web link)

[5] Improved FTWeightedHashT Apriori Algorithm for Big Data using Hadoop-MapReduce Model

Ammar SM (Department of IT, Yemen Academic for Graduate Studies, Yemen), Ba-Alwi FM (Faculty of Computer & IT, Sana’a University, Yemen). Improved FTWeightedHashT Apriori Algorithm for Big Data using Hadoop-MapReduce Model. (web link)
