To eliminate the possibility of obtaining contradictory solutions to a single problem during diagnosis, the conventional Agent-oriented Approach (AoA) is replaced by an Intelligent Searching Approach (ISA) built on intelligent artificial agents that behave like humans and decide dynamically in any situation. These agents analyse medical forums and derive findings more accurately than any manual approach. Multiple agents analyse the blogs by dividing the work areas among themselves and communicating through the FIPA Agent Communication Language (ACL). The local solutions thus formed are forwarded to a Global Agent, which controls all operations and decides on the best solution. Because the Global Agent supervises all other agents, it eliminates unwanted and ineffective communication between the local agents, keeping communication time to a minimum. From these local solutions, a prioritization matrix is built using advanced clustering techniques to produce a prioritized list of suggested best solutions. Once a decision is made, a refinement process runs recursively, checking for any better solution to the input; on completion, the Global Agent returns the exact result of the discussion. This saves time compared with searching the entire blog for the result. The approach thus offers a different way of obtaining solutions, keeping discussion and inter-agent communication time minimal without compromising the quality of the solution.
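The local-to-global aggregation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class names (`LocalAgent`, `GlobalAgent`, `Solution`), the frequency-based scoring, and the sample forum posts are all hypothetical stand-ins, and the prioritization matrix is reduced here to a simple ranked list.

```python
from dataclasses import dataclass

@dataclass
class Solution:
    text: str       # suggested remedy extracted from a forum slice
    score: float    # local agent's confidence in this solution

class LocalAgent:
    """Analyses one slice of the forum and proposes local solutions."""
    def __init__(self, name, posts):
        self.name = name
        self.posts = posts

    def analyse(self):
        # Toy scoring: weight each candidate by how often it appears
        # in this agent's slice (stands in for real text analysis).
        counts = {}
        for post in self.posts:
            counts[post] = counts.get(post, 0) + 1
        total = len(self.posts)
        return [Solution(text, n / total) for text, n in counts.items()]

class GlobalAgent:
    """Collects the local solutions and builds a prioritized ranking."""
    def prioritize(self, local_results):
        merged = {}
        for solutions in local_results:
            for s in solutions:
                merged[s.text] = merged.get(s.text, 0.0) + s.score
        # The paper's prioritization matrix is reduced to a ranked list.
        return sorted(merged.items(), key=lambda kv: kv[1], reverse=True)

agents = [
    LocalAgent("a1", ["rest", "rest", "antibiotics"]),
    LocalAgent("a2", ["rest", "hydration", "rest"]),
]
global_agent = GlobalAgent()
ranking = global_agent.prioritize(a.analyse() for a in agents)
best = ranking[0][0]
print(best)  # → rest
```

Because only the Global Agent merges results, the local agents never need to exchange messages with one another, which is the source of the communication savings claimed above.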
This article was published in the Journal of Medical Systems.
The Nuclear Metrology Laboratory (LMN) - IPEN, São Paulo, Brazil - developed a Digital Coincidence System (DCS), based on the Coincidence Counting Methodology, in order to improve its capabilities in...
Magnetoencephalography (MEG) is an advanced magnetic source imaging technology that measures the magnetic fields produced by neural activities. It has been extensively used in scientific research and ...
The recent improvements in mass spectrometry instruments and new analytical methods are increasing the intersection between proteomics and big data science. In addition, bioinformatics analysis is bec...
The design and implementation of a software package for the analysis of single-crystal NMR data is presented. The SCFit software can treat spectra arising from various interactions: (i) chemical shift...
Gas chromatography coupled with mass spectrometry can provide an extensive overview of the metabolic state of a biological system. Analysis of raw mass spectrometry data requires powerful data process...
This is a study funded by a Phase II SBIR to build and test the feasibility of a new software prototype for delivering evidence-based behavioral therapy treatment for autism. The software,...
The DELPhi system is a software device that is used for the noninvasive evaluation of brain plasticity and connectivity. The DELPhi software uses EEG and TMS devices as accessories. Standa...
The investigators proposed to conduct a retrospective study to evaluate the impact of the implementation of apTeleCare software on the management of dialysis patients at home in terms of t...
The purpose of this project is to validate quantitative lung structure assessment using an automated analysis software (VIDA), for application to low dose computed tomography (LDCT) acquir...
Aim of the project is the development of a software for implementation of anesthesiology guidelines in the preoperative evaluation. The software bases on the 2014 and 2018 guidelines of th...
Systematic gathering of data for a particular purpose from various sources, including questionnaires, interviews, observation, existing records, and electronic devices. The process is usually preliminary to statistical analysis of the data.
Software designed to store, manipulate, manage, and control data for specific uses.
Software used to locate data or information stored in machine-readable form locally or at a distance such as an INTERNET site.
Specifications and instructions applied to the software.
The act of testing the software for compliance with a standard.