Wiki
We have compiled hundreds of entries to help you understand artificial intelligence.
AIOps refers to the use of big data, advanced analytics, and machine learning to enhance the operational and functional workflows of IT teams.
Apptainer is a container system for high performance computing (HPC), formerly known as Singularity. It is used to build and run Linux containers, packaging software, libraries, and runtimes in isolated environments.
A Beowulf cluster is a high-performance parallel computing cluster assembled from inexpensive commodity personal computer hardware to maximize cost-effectiveness.
Reinforcement Learning from Human Feedback (RLHF) is a method for training AI systems that combines reinforcement learning with feedback provided by humans.
Batch processing combines a series of commands or programs in sequence, typically in a batch file, and executes them as a group without manual intervention.
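As a minimal illustrative sketch (not a real batch-file interpreter), batch processing can be modeled as queuing up jobs and running them in submission order without any interactive input:

```python
# Minimal sketch of batch processing: jobs are collected first,
# then executed one after another as a single unattended run.
def run_batch(jobs):
    results = []
    for job in jobs:              # execute each queued job in sequence
        results.append(job())
    return results

if __name__ == "__main__":
    batch = [lambda: 1 + 1, lambda: 2 * 3, lambda: sum(range(5))]
    print(run_batch(batch))       # all jobs run in order, no user input needed
```

In a real system the jobs would be shell commands or programs listed in a batch file, but the control flow is the same: collect first, then execute in order.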
Computational fluid dynamics applications run most efficiently on high performance computing (HPC) systems with high-throughput local storage, low-latency networks, and optimized CPUs.
Data Science (DS) aims to extract valuable information, insights and knowledge from large-scale data.
A Field Programmable Gate Array (FPGA) is a semiconductor device based on a matrix of configurable logic blocks (CLBs) connected by programmable interconnects. It can be reprogrammed after fabrication to suit the desired application or functional requirements.
In computing, Remote Direct Memory Access (RDMA) is a technology that transfers data directly from the memory of one computer into the memory of another without involving either computer's operating system.
Hardware acceleration refers to assigning computationally intensive tasks to specialized hardware. This reduces the load on the central processing unit and is more efficient than running the same task in software on a general-purpose CPU.
Parallel computing is a subfield of high performance computing (HPC). In contrast to serial computing, it improves efficiency by executing multiple tasks simultaneously on multiple processors or computers.
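A hedged sketch of the serial-versus-parallel contrast, using the standard library's `multiprocessing` pool to distribute the same workload across processes (the workload and worker count here are illustrative):

```python
# Contrast serial execution with parallel execution of the same tasks.
from multiprocessing import Pool

def square(n):
    return n * n

def serial(xs):
    # one task at a time, on a single processor
    return [square(x) for x in xs]

def parallel(xs, workers=4):
    # the same tasks distributed across a pool of worker processes
    with Pool(workers) as pool:
        return pool.map(square, xs)

if __name__ == "__main__":
    data = list(range(8))
    assert serial(data) == parallel(data)   # same results, different execution model
```

The results are identical; parallelism changes only how the work is scheduled, which is why it pays off mainly for workloads large enough to amortize the coordination overhead.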
High-throughput computing (HTC) is a computing paradigm that uses many computing resources to run large numbers of independent tasks in parallel.
The term High Performance Computing (HPC) emerged after the term "supercomputing" and is a field of computing that uses powerful computing resources to solve complex problems.
A large language model (LLM) is an artificial intelligence algorithm that uses a neural network with a very large number of parameters, trained with self-supervised learning techniques, to process and understand human language or text.
Output modulation is a method of perturbing the output representation, often used to increase the diversity of ensemble learners: for example, classification outputs are converted into regression outputs before the individual learners are constructed.
Random forest is an ensemble learning algorithm that trains many decision trees on bootstrap samples of the data and aggregates their predictions, typically by majority vote for classification or averaging for regression.
A random walk is a stochastic model consisting of a succession of random steps; it is used to describe irregular, unpredictable changes.
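A minimal sketch of the simplest case, a one-dimensional symmetric random walk in which each step moves +1 or -1 with equal probability:

```python
# One-dimensional symmetric random walk.
import random

def random_walk(steps, seed=0):
    rng = random.Random(seed)
    position, path = 0, [0]
    for _ in range(steps):
        position += rng.choice((-1, 1))   # each step is an independent random move
        path.append(position)
    return path

if __name__ == "__main__":
    print(random_walk(10))   # the trajectory drifts irregularly around its start
```

Variants change the step distribution or the dimension, but the defining feature remains: the next position depends only on the current one plus an independent random increment.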
Neural Machine Translation (NMT) is a machine translation framework based purely on neural networks. It uses neural networks to achieve end-to-end translation from a source language to a target language.
The Neural Turing Machine (NTM) is a neural-network analogue of the Turing machine: inspired by it, the NTM implements its computation as a differentiable function, and consists of a neural network controller coupled to an external memory.
On-policy learning means that the policy used to generate samples is the same as the policy whose parameters are being updated. A typical example of an on-policy method is the SARSA algorithm.
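A hedged sketch of the on-policy idea with tabular SARSA: the epsilon-greedy policy that selects actions is the very policy whose value estimates get updated. The environment (a tiny corridor of states 0..4 with a reward at state 4) and all hyperparameters are illustrative.

```python
# Tabular SARSA on a 1-D corridor: states 0..4, reward on reaching state 4.
import random

ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
N_STATES, ACTIONS = 5, (1, -1)           # move right or left (ties favour the first)

def step(s, a):
    s2 = min(max(s + a, 0), N_STATES - 1)
    done = s2 == N_STATES - 1
    return s2, (1.0 if done else 0.0), done

def choose(q, s, rng):
    # epsilon-greedy: behaviour policy and target policy are the SAME policy
    if rng.random() < EPS:
        return rng.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(s, a)])

def sarsa(episodes=200, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s, done = 0, False
        a = choose(q, s, rng)
        while not done:
            s2, r, done = step(s, a)
            a2 = choose(q, s2, rng)      # next action drawn from the same policy
            target = r + (0.0 if done else GAMMA * q[(s2, a2)])
            q[(s, a)] += ALPHA * (target - q[(s, a)])
            s, a = s2, a2
    return q

if __name__ == "__main__":
    q = sarsa()
    print(q[(3, 1)] > q[(3, -1)])   # moving right toward the goal is valued higher
```

The on-policy character is in the update: the bootstrap target uses `q[(s2, a2)]` for the action the current policy actually selects next, not the maximizing action (which would make it off-policy Q-learning).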
The Receiver Operating Characteristic (ROC) curve is an evaluation tool for a recognition or matching algorithm. It relates the matching-score threshold to the false positive rate and the rejection rate, reflecting the trade-off between the two as the threshold varies.
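An illustrative sketch of how the curve arises: sweep a decision threshold over the match scores and record the (false positive rate, true positive rate) pair at each threshold. The scores and labels below are made up for the example.

```python
# Compute the (FPR, TPR) points of an ROC curve by sweeping the threshold.
def roc_points(scores, labels):
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores), reverse=True):
        preds = [s >= t for s in scores]             # accept if score meets threshold
        tp = sum(p and y for p, y in zip(preds, labels))
        fp = sum(p and not y for p, y in zip(preds, labels))
        points.append((fp / neg, tp / pos))          # (FPR, TPR) at this threshold
    return points

if __name__ == "__main__":
    scores = [0.9, 0.8, 0.6, 0.4, 0.2]
    labels = [1, 1, 0, 1, 0]
    print(roc_points(scores, labels))
```

Lowering the threshold accepts more matches, raising both the true positive rate and the false positive rate; the resulting curve makes the trade-off visible at a glance.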
The restricted Boltzmann machine (RBM) is a stochastic neural network model with a two-layer structure, symmetric connections between the layers, and no self-feedback.
Simultaneous Localization and Mapping (SLAM) is a technique in robotics in which a robot builds a map of an unknown environment while simultaneously tracking its own position within that map.
Statistical learning, also known as statistical machine learning, is a discipline that builds probabilistic and statistical models from data in order to predict and analyze data.