MapReduce Paradigm Using Large-Scale Ontologies
With the coming deluge of semantic information, the rapid growth of ontology bases poses major challenges for efficient and scalable reasoning. Traditional centralized reasoning methods are not sufficient to process large ontologies, so distributed reasoning methods are required to improve the scalability and performance of inference. The authors propose an incremental, distributed inference method for large-scale ontologies using MapReduce, which achieves high-performance reasoning and runtime querying, especially over incremental knowledge bases. By constructing transfer inference forests and effective assertional triples, storage is greatly reduced and the reasoning process is simplified and accelerated. Finally, a prototype system is implemented on a Hadoop framework, and the experimental results validate the usability and effectiveness of the proposed approach. Ontology mapping can yield more accurate results if the mapping process can effectively handle the uncertainty caused by the incomplete and inconsistent information it uses and produces. A survey of different reasoning approaches focused on semantic inference is also presented.
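The abstract does not detail the transfer inference forest construction, but the general pattern of MapReduce-based ontology reasoning can be sketched as iterated map/reduce rounds that apply an inference rule until a fixpoint is reached. The sketch below simulates this locally in plain Python for one RDFS rule (if x rdf:type C and C rdfs:subClassOf D, then x rdf:type D); the toy triples and function names are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

# Toy triple store: (subject, predicate, object). Illustrative data only.
triples = [
    ("Fido", "rdf:type", "Dog"),
    ("Dog", "rdfs:subClassOf", "Mammal"),
    ("Mammal", "rdfs:subClassOf", "Animal"),
]

def map_phase(triple):
    """Emit join keys for the RDFS subclass rule:
    (x rdf:type C) + (C rdfs:subClassOf D)  =>  (x rdf:type D)."""
    s, p, o = triple
    if p == "rdf:type":
        yield (o, ("instance", s))       # key on the class C
    elif p == "rdfs:subClassOf":
        yield (s, ("superclass", o))     # key on the subclass C

def reduce_phase(key, values):
    """Join instances and superclasses that share the same class key."""
    instances = [v for tag, v in values if tag == "instance"]
    supers = [v for tag, v in values if tag == "superclass"]
    for x in instances:
        for d in supers:
            yield (x, "rdf:type", d)

def mapreduce_round(triples):
    """One simulated MapReduce pass: map, shuffle by key, reduce."""
    groups = defaultdict(list)
    for t in triples:
        for key, value in map_phase(t):
            groups[key].append(value)
    inferred = set()
    for key, values in groups.items():
        inferred.update(reduce_phase(key, values))
    return inferred

# Iterate to a fixpoint; each round may enable further inferences.
closure = set(triples)
while True:
    new = mapreduce_round(closure) - closure
    if not new:
        break
    closure |= new
```

On the toy data this derives that Fido is a Mammal and, after a second round, an Animal. On a real Hadoop cluster, each round would be a MapReduce job over the triple set, and an incremental approach (as the paper proposes) would avoid recomputing the full closure when new triples arrive.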
Keywords: Big data, MapReduce, ontology reasoning, resource description framework, semantic web
Cite this Article
Vasanthi K, Vinutha HP. MapReduce Paradigm Using Large-Scale Ontologies. Recent Trends in Parallel Computing. 2017; 4(2): 1–9p.