Shallow copy and deep copy in Python, NumPy, and PyTorch

1. Shallow copy and deep copy in Python

import copy

a = [1, 2, 3, 4, [11, 22, 33, [111, 222]]]
b = a                   # plain assignment: b is just another name for the same list
c = a.copy()            # shallow copy: new outer list, nested lists are still shared
d = copy.deepcopy(a)    # deep copy: the whole nested structure is duplicated
print('before modify\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
 before modify
 a=
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 b = a=
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 c = a.copy()=
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 d = copy.deepcopy(a)
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 

Note: the object-reference diagram for this example was drawn with the Python Tutor code visualizer (Visualize code in Python, JavaScript, C, C++, and Java).

a[0] = 10
print('after a[0] = 10\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
 after a[0] = 10
 a=
 [10, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 b = a=
 [10, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 c = a.copy()=
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
 d = copy.deepcopy(a)
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
a[4][0] = 100
print('after a[4][0] = 100\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
after a[4][0] = 100
 a=
 [10, 2, 3, 4, [100, 22, 33, [111, 222]]] 
 b = a=
 [10, 2, 3, 4, [100, 22, 33, [111, 222]]] 
 c = a.copy()=
 [1, 2, 3, 4, [100, 22, 33, [111, 222]]] 
 d = copy.deepcopy(a)
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
a[4][3][0] = 1000
print('after a[4][3][0] = 1000\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
after a[4][3][0] = 1000
 a=
 [10, 2, 3, 4, [100, 22, 33, [1000, 222]]] 
 b = a=
 [10, 2, 3, 4, [100, 22, 33, [1000, 222]]] 
 c = a.copy()=
 [1, 2, 3, 4, [100, 22, 33, [1000, 222]]] 
 d = copy.deepcopy(a)
 [1, 2, 3, 4, [11, 22, 33, [111, 222]]] 
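
Taken together, the three edits show the layering: b = a copies nothing, so every change to a shows up in b; a.copy() duplicates only the outer list, so a[0] = 10 does not reach c but a[4][0] = 100 and a[4][3][0] = 1000 do, because the nested lists are still shared; copy.deepcopy(a) duplicates every level, so d never changes.

As a minimal sketch (not part of the original post), the same sharing can be verified directly with the identity operator is instead of mutating the list:

import copy

a = [1, 2, 3, 4, [11, 22, 33, [111, 222]]]
b = a
c = a.copy()
d = copy.deepcopy(a)

print(b is a)                 # True  - assignment just binds another name
print(c is a, c[4] is a[4])   # False True - new outer list, shared inner lists
print(d is a, d[4] is a[4])   # False False - every level duplicated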

2. Shallow copy and deep copy in NumPy

import numpy as np

a1 = np.random.randn(2, 3)
b1 = a1                   # plain assignment: b1 aliases the same ndarray
c1 = a1.copy()            # copies the data into a new array
d1 = copy.deepcopy(a1)    # also copies the data
print('before modify\r\n a1=\r\n', a1, '\r\n','b1 = a1=\r\n', b1, '\r\n','c1 = a1.copy()=\r\n', c1, '\r\n','d1 = copy.deepcopy(a1)=\r\n', d1, '\r\n')
a1[0] = 10
print('after a1[0] = 10\r\n a1=\r\n', a1, '\r\n','b1 = a1=\r\n', b1, '\r\n','c1 = a1.copy()=\r\n', c1, '\r\n','d1 = copy.deepcopy(a1)=\r\n', d1, '\r\n')

before modify
 a1=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
 b1 = a1=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
 c1 = a1.copy()=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
 d1 = copy.deepcopy(a1)=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
after a1[0] = 10
 a1=
 [[10.         10.         10.        ]
 [ 0.14232255  2.93331428  0.88511785]] 
 b1 = a1=
 [[10.         10.         10.        ]
 [ 0.14232255  2.93331428  0.88511785]] 
 c1 = a1.copy()=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
 d1 = copy.deepcopy(a1)=
 [[ 1.48618757 -0.90409826  2.05529475]
 [ 0.14232255  2.93331428  0.88511785]] 
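
For an ndarray the picture is simpler than for nested lists: the array owns one flat data buffer, so b1 = a1 aliases the same array, while both a1.copy() and copy.deepcopy(a1) produce an independent buffer. That is why only a1 and b1 change after a1[0] = 10.

One case NumPy adds that the snippet above does not show is the view: slicing or reshaping returns a new array object that still shares the original buffer. A minimal sketch (not from the original post) using np.shares_memory to make the sharing visible:

import numpy as np

a1 = np.arange(6).reshape(2, 3)
alias = a1          # same object
row = a1[0]         # a view: a different object that shares the buffer
deep = a1.copy()    # an independent copy of the data

print(alias is a1)                           # True
print(row is a1, np.shares_memory(row, a1))  # False True
print(np.shares_memory(deep, a1))            # False

row[0] = 100
print(a1[0, 0])                              # 100 - writing through the view changes a1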

3. Shallow copy and deep copy in PyTorch

import torch

a2 = torch.randn(2, 3)
b2 = torch.Tensor(a2)          # wraps the same storage, so it follows changes to a2
bb2 = torch.tensor(a2)         # copies the data into a new tensor
c2 = a2.detach()               # shares storage with a2, drops gradient tracking
cc2 = a2.clone()               # copies the data
ccc2 = a2.clone().detach()     # copies the data and drops gradient tracking
print('before modify\r\n a2=\r\n', a2, '\r\n','b2 = torch.Tensor(a2)=\r\n', b2, '\r\n','bb2 = torch.tensor(a2)=\r\n', bb2, '\r\n','c2 = a2.detach()=\r\n', c2, '\r\n','cc2 = a2.clone()=\r\n', cc2, '\r\n','ccc2 = a2.clone().detach()=\r\n', ccc2)
a2[0] = 0
print('after modify\r\n a2=\r\n', a2, '\r\n','b2 = torch.Tensor(a2)=\r\n', b2, '\r\n','bb2 = torch.tensor(a2)=\r\n', bb2, '\r\n','c2 = a2.detach()=\r\n', c2, '\r\n','cc2 = a2.clone()=\r\n', cc2, '\r\n','ccc2 = a2.clone().detach()=\r\n', ccc2)

before modify
 a2=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 b2 = torch.Tensor(a2)=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 bb2 = torch.tensor(a2)=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 c2 = a2.detach()=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 cc2 = a2.clone()=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 ccc2 = a2.clone().detach()=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]])
after modify
 a2=
 tensor([[ 0.0000,  0.0000,  0.0000],
        [ 0.8979, -0.4158,  1.1338]]) 
 b2 = torch.Tensor(a2)=
 tensor([[ 0.0000,  0.0000,  0.0000],
        [ 0.8979, -0.4158,  1.1338]]) 
 bb2 = torch.tensor(a2)=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 c2 = a2.detach()=
 tensor([[ 0.0000,  0.0000,  0.0000],
        [ 0.8979, -0.4158,  1.1338]]) 
 cc2 = a2.clone()=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]]) 
 ccc2 = a2.clone().detach()=
 tensor([[-0.6472,  1.3437, -0.3386],
        [ 0.8979, -0.4158,  1.1338]])
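
The output splits the six names into two groups. torch.Tensor(a2) and a2.detach() track a2 after a2[0] = 0, so they share its storage (detach() additionally removes the result from gradient tracking). torch.tensor(a2), a2.clone() and a2.clone().detach() keep the old values, so they copy the data; recent PyTorch versions even emit a warning suggesting sourceTensor.clone().detach() instead of torch.tensor(sourceTensor) for this purpose.

As a minimal sketch (not from the original post), storage sharing can be checked without mutating anything by comparing data_ptr(), the address of a tensor's underlying storage:

import torch

a2 = torch.randn(2, 3)

shared  = a2.detach()          # same storage, gradient tracking dropped
wrapped = torch.Tensor(a2)     # also the same storage
copied  = a2.clone()           # new storage (differentiable if a2 requires grad)
frozen  = a2.clone().detach()  # new storage, no autograd connection

print(shared.data_ptr() == a2.data_ptr())   # True  - edits to a2 show up here
print(wrapped.data_ptr() == a2.data_ptr())  # True
print(copied.data_ptr() == a2.data_ptr())   # False - a real copy of the data
print(frozen.data_ptr() == a2.data_ptr())   # False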

Complete code

import numpy as np
import copy
import torch

a = [1, 2, 3, 4, [11, 22, 33, [111, 222]]]
b = a
c = a.copy()
d = copy.deepcopy(a)
print('before modify\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
a[0] = 10
print('after a[0] = 10\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
a[4][0] = 100
print('after a[4][0] = 100\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')
a[4][3][0] = 1000
print('after a[4][3][0] = 1000\r\n a=\r\n', a, '\r\n','b = a=\r\n', b, '\r\n','c = a.copy()=\r\n', c, '\r\n','d = copy.deepcopy(a)\r\n', d, '\r\n')

a1 = np.random.randn(2, 3)
b1 = a1
c1 = a1.copy()
d1 = copy.deepcopy(a1)
print('before modify\r\n a1=\r\n', a1, '\r\n','b1 = a1=\r\n', b1, '\r\n','c1 = a1.copy()=\r\n', c1, '\r\n','d1 = copy.deepcopy(a1)=\r\n', d1, '\r\n')
a1[0] = 10
print('after a1[0] = 10\r\n a1=\r\n', a1, '\r\n','b1 = a1=\r\n', b1, '\r\n','c1 = a1.copy()=\r\n', c1, '\r\n','d1 = copy.deepcopy(a1)=\r\n', d1, '\r\n')

a2 = torch.randn(2, 3)
b2 = torch.Tensor(a2)
bb2 = torch.tensor(a2)
c2 = a2.detach()
cc2 = a2.clone()
ccc2 = a2.clone().detach()
print('before modify\r\n a2=\r\n', a2, '\r\n','b2 = torch.Tensor(a2)=\r\n', b2, '\r\n','bb2 = torch.tensor(a2)=\r\n', bb2, '\r\n','c2 = a2.detach()=\r\n', c2, '\r\n','cc2 = a2.clone()=\r\n', cc2, '\r\n','ccc2 = a2.clone().detach()=\r\n', ccc2)
a2[0] = 0
print('after a2[0] = 0\r\n a2=\r\n', a2, '\r\n','b2 = torch.Tensor(a2)=\r\n', b2, '\r\n','bb2 = torch.tensor(a2)=\r\n', bb2, '\r\n','c2 = a2.detach()=\r\n', c2, '\r\n','cc2 = a2.clone()=\r\n', cc2, '\r\n','ccc2 = a2.clone().detach()=\r\n', ccc2)
