1. Edge computing has emerged as a promising technique for providing low-latency computation offloading to resource-limited mobile user devices and Internet of Things (IoT) applications.
2. This paper proposes a multiple algorithm service model (MASM), in which heterogeneous algorithms with different computational complexities and required data sizes can fulfill the same task, together with an optimization model that reduces the weighted energy-delay cost by jointly optimizing the workload assignment weights and the computing capacities of the virtual machines (a toy sketch of such a joint optimization follows this summary).
3. Numerical results demonstrate the effectiveness of the proposed method: energy and delay costs can be reduced significantly by sacrificing some quality of result (QoR) of the offloaded AI tasks.
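To make the optimization described in point 2 concrete, below is a minimal sketch of how a joint optimization over workload weights and VM computing capacities might look. This is not the authors' actual formulation: the cost function, the two-variant setup, and all parameter names and values (CYCLES, DATA, UPLINK, KAPPA, BETA) are illustrative assumptions, and the QoR trade-off discussed in the paper is omitted for simplicity.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative parameters (assumptions for this sketch, not values from the paper).
CYCLES = np.array([2e9, 8e9])    # CPU cycles per task for a "light" and a "heavy" algorithm
DATA = np.array([0.5e6, 2.0e6])  # bits uploaded per task for each algorithm variant
UPLINK = 20e6                    # uplink rate in bits/s
KAPPA = 1e-28                    # effective switched-capacitance constant of the VM CPUs
BETA = 0.5                       # trade-off weight between energy and delay

def cost(x):
    """Weighted energy-delay cost given workload weights w and VM capacities f."""
    w = x[:2]                 # fraction of tasks routed to each algorithm variant
    f = x[2:]                 # computing capacity (cycles/s) of the VM hosting each variant
    exec_delay = CYCLES / f   # processing time per task (queueing ignored)
    tx_delay = DATA / UPLINK  # upload time per task
    delay = np.sum(w * (exec_delay + tx_delay))
    # Dynamic CPU energy per task: kappa * f^2 * cycles (a common approximation).
    energy = np.sum(w * KAPPA * f**2 * CYCLES)
    return BETA * energy + (1.0 - BETA) * delay

x0 = np.array([0.5, 0.5, 2e9, 2e9])                                 # even split, 2 GHz VMs
constraints = [{"type": "eq", "fun": lambda x: x[0] + x[1] - 1.0}]  # weights must sum to 1
bounds = [(0.0, 1.0), (0.0, 1.0), (1e8, 4e9), (1e8, 4e9)]           # weight and capacity limits

res = minimize(cost, x0, method="SLSQP", bounds=bounds, constraints=constraints)
print("workload weights:", res.x[:2])
print("VM capacities (cycles/s):", res.x[2:])
```

In a formulation of this shape, raising a VM's capacity lowers its delay term but increases its energy term quadratically, so the optimizer balances the two according to BETA; the paper's full model additionally trades off the QoR of the offloaded tasks, which this sketch does not capture.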
The article is generally trustworthy in its reporting: it supports its claims with numerical results from experiments on the proposed MASM model and cites relevant literature, which strengthens its credibility. However, it shows some bias in promoting the authors' own solution without examining alternative approaches or counterarguments. It also omits any discussion of the potential risks of this model or of edge AI solutions more broadly. Likewise, while the authors note advantages of edge computing such as low latency and energy savings, they do not address its drawbacks or challenges. Overall, the article is reliable in evidencing its claims, but it would benefit from engaging with counterarguments and the potential risks of edge AI solutions.