
Day 75: General-Purpose BP Neural Network (2. Single-Layer Implementation)

Code:

package dl;

import java.util.Arrays;
import java.util.Random;

/**
 * Ann layer.
 */
public class AnnLayer {

    /**
     * The number of input.
     */
    int numInput;

    /**
     * The number of output.
     */
    int numOutput;

    /**
     * The learning rate.
     */
    double learningRate;

    /**
     * The mobp.
     */
    double mobp;

    /**
     * The weight matrix.
     */
    double[][] weights;

    /**
     * The delta weight matrix.
     */
    double[][] deltaWeights;

    /**
     * Error on nodes.
     */
    double[] errors;

    /**
     * The inputs.
     */
    double[] input;

    /**
     * The outputs.
     */
    double[] output;

    /**
     * The output after activate.
     */
    double[] activatedOutput;

    /**
     * The activator.
     */
    Activator activator;

    /**
     * The random number generator.
     */
    Random random = new Random();

    /**
     ********************
     * The first constructor.
     *
     * @param paraNumInput
     *            The number of input nodes.
     * @param paraNumOutput
     *            The number of output nodes.
     * @param paraActivator
     *            The activator.
     * @param paraLearningRate
     *            The learning rate.
     * @param paraMobp
     *            The momentum coefficient.
     ********************
     */
    public AnnLayer(int paraNumInput, int paraNumOutput, char paraActivator,
            double paraLearningRate, double paraMobp) {
        numInput = paraNumInput;
        numOutput = paraNumOutput;
        learningRate = paraLearningRate;
        mobp = paraMobp;

        weights = new double[numInput + 1][numOutput];
        deltaWeights = new double[numInput + 1][numOutput];
        for (int i = 0; i < numInput + 1; i++) {
            for (int j = 0; j < numOutput; j++) {
                weights[i][j] = random.nextDouble();
            } // Of for j
        } // Of for i

        errors = new double[numInput];
        input = new double[numInput];
        output = new double[numOutput];
        activatedOutput = new double[numOutput];

        activator = new Activator(paraActivator);
    }// Of the first constructor

    /**
     ********************
     * Set parameters for the activator.
     *
     * @param paraAlpha
     *            Alpha. Only valid for certain types.
     * @param paraBeta
     *            Beta.
     * @param paraGamma
     *            Gamma.
     ********************
     */
    public void setParameters(double paraAlpha, double paraBeta, double paraGamma) {
        activator.setAlpha(paraAlpha);
        activator.setBeta(paraBeta);
        activator.setGamma(paraGamma);
    }// Of setParameters

    /**
     ********************
     * Forward prediction.
     *
     * @param paraInput
     *            The input data of one instance.
     * @return The data at the output end.
     ********************
     */
    public double[] forward(double[] paraInput) {
        // System.out.println("Ann layer forward " + Arrays.toString(paraInput));
        // Copy data.
        for (int i = 0; i < numInput; i++) {
            input[i] = paraInput[i];
        } // Of for i

        // Calculate the weighted sum for each output.
        for (int i = 0; i < numOutput; i++) {
            output[i] = weights[numInput][i];
            for (int j = 0; j < numInput; j++) {
                output[i] += input[j] * weights[j][i];
            } // Of for j

            activatedOutput[i] = activator.activate(output[i]);
        } // Of for i

        return activatedOutput;
    }// Of forward

    /**
     ********************
     * Back propagation and change the edge weights.
     *
     * @param paraErrors
     *            The errors propagated back from the next layer (for the last
     *            layer, obtained via getLastLayerErrors).
     * @return The errors on the input nodes of this layer.
     ********************
     */
    public double[] backPropagation(double[] paraErrors) {
        // Step 1. Adjust the errors.
        for (int i = 0; i < paraErrors.length; i++) {
            paraErrors[i] = activator.derive(output[i], activatedOutput[i]) * paraErrors[i];
        } // Of for i

        // Step 2. Compute current errors and update the weights with momentum.
        for (int i = 0; i < numInput; i++) {
            errors[i] = 0;
            for (int j = 0; j < numOutput; j++) {
                errors[i] += paraErrors[j] * weights[i][j];
                deltaWeights[i][j] = mobp * deltaWeights[i][j]
                        + learningRate * paraErrors[j] * input[i];
                weights[i][j] += deltaWeights[i][j];
            } // Of for j
        } // Of for i

        // Update the bias weights.
        for (int j = 0; j < numOutput; j++) {
            deltaWeights[numInput][j] = mobp * deltaWeights[numInput][j] + learningRate * paraErrors[j];
            weights[numInput][j] += deltaWeights[numInput][j];
        } // Of for j

        return errors;
    }// Of backPropagation

    /**
     ********************
     * I am the last layer, set the errors.
     *
     * @param paraTarget
     *            For 3-class data, it is [0, 0, 1], [0, 1, 0] or [1, 0, 0].
     ********************
     */
    public double[] getLastLayerErrors(double[] paraTarget) {
        double[] resultErrors = new double[numOutput];
        for (int i = 0; i < numOutput; i++) {
            resultErrors[i] = (paraTarget[i] - activatedOutput[i]);
        } // Of for i

        return resultErrors;
    }// Of getLastLayerErrors

    /**
     ********************
     * Show me.
     ********************
     */
    public String toString() {
        String resultString = "";
        resultString += "Activator: " + activator;
        resultString += "\r\n weights = " + Arrays.deepToString(weights);
        return resultString;
    }// Of toString

    /**
     ********************
     * Unit test.
     ********************
     */
    public static void unitTest() {
        AnnLayer tempLayer = new AnnLayer(2, 3, 's', 0.01, 0.1);
        double[] tempInput = { 1, 4 };

        System.out.println(tempLayer);

        double[] tempOutput = tempLayer.forward(tempInput);
        System.out.println("Forward, the output is: " + Arrays.toString(tempOutput));

        double[] tempError = tempLayer.backPropagation(tempOutput);
        System.out.println("Back propagation, the error is: " + Arrays.toString(tempError));
    }// Of unitTest

    /**
     ********************
     * Test the algorithm.
     ********************
     */
    public static void main(String[] args) {
        unitTest();
    }// Of main
}// Of class AnnLayer
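
The class above depends on an Activator class that is not listed in this post (it comes from the previous installment of this series). For readers who want to compile and run unitTest() on its own, the following is a minimal sketch of such an activator, assuming only the sigmoid type 's' used in the unit test is needed. The method names (activate, derive, setAlpha, setBeta, setGamma) are taken from the calls in AnnLayer; the bodies are my assumption, not the original implementation.

package dl;

/**
 * A minimal stand-in for the Activator class referenced by AnnLayer.
 * Only the sigmoid type 's' is implemented; alpha, beta and gamma are
 * stored but unused in this sketch.
 */
public class Activator {
    char activatorType;
    double alpha, beta, gamma;

    public Activator(char paraActivatorType) {
        activatorType = paraActivatorType;
    }// Of the constructor

    public void setAlpha(double paraAlpha) {
        alpha = paraAlpha;
    }// Of setAlpha

    public void setBeta(double paraBeta) {
        beta = paraBeta;
    }// Of setBeta

    public void setGamma(double paraGamma) {
        gamma = paraGamma;
    }// Of setGamma

    /**
     * Activate: sigmoid for 's', identity otherwise.
     */
    public double activate(double paraValue) {
        switch (activatorType) {
        case 's':
            return 1 / (1 + Math.exp(-paraValue));
        default:
            return paraValue;
        }// Of switch
    }// Of activate

    /**
     * Derivative, given the raw weighted sum and the activated output.
     * For the sigmoid, f'(x) = f(x) * (1 - f(x)).
     */
    public double derive(double paraValue, double paraActivatedValue) {
        switch (activatorType) {
        case 's':
            return paraActivatedValue * (1 - paraActivatedValue);
        default:
            return 1;
        }// Of switch
    }// Of derive

    public String toString() {
        return "Activator with type '" + activatorType + "'";
    }// Of toString
}// Of class Activator

Save this sketch as Activator.java in the same dl package and the unit test compiles. Note that derive receives both the raw weighted sum and the activated output, matching how backPropagation calls it: for the sigmoid, the derivative is cheapest to compute from the activated value, while other activator types may need the raw value.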

Result:

 
