A G code generation method of industrial CT image based on data matching between layers
DOI:
Author: 谭川东, 何泳江, 罗雪清, 方诚, 段黎明
Affiliation:

Author biography:

Corresponding author:

CLC number: TP391.9; TH74

Fund project: Supported by the National Key Scientific Instrument and Equipment Development Project (2013YQ030629)


Abstract:

The existing approaches for converting industrial CT images into G code for 3D printing are inefficient. To address this problem, a method is proposed that converts industrial CT images directly into G code based on data matching between adjacent layers. First, the Canny operator is used to extract the contours of the industrial CT images. Second, the contour bifurcation problem is handled so that the geometric information of adjacent layers can be matched. Third, contours are interpolated between adjacent layers to satisfy the layer-thickness requirement of 3D printing and thereby avoid the "staircase effect". Finally, the G code for 3D printing is obtained through fill coding. With the proposed method, converting the CT images of a wheel hub into G code takes 10.5 s, far less than the time required by other, indirect conversion methods; the 3D-printed wheel hub shows no "staircase effect", and its average dimensional error rate is 0.25%. The experimental results show that the method involves no intermediate file format, offers high conversion efficiency with an error comparable to that of traditional methods, and is suitable for parts with complex internal cavity structures.
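
The abstract outlines a four-step pipeline: Canny contour extraction, adjacent-layer data matching, inter-layer contour interpolation, and fill coding into G code. As a rough illustration of how such a pipeline can be wired together, the Python sketch below uses OpenCV's Canny and findContours for contour extraction, resamples each contour to a fixed point count as a crude stand-in for the paper's adjacent-layer matching (the bifurcation handling is omitted), blends matched contours linearly to insert intermediate layers, and emits perimeter-only G code. It is not the authors' implementation; the slice file names, Canny thresholds, point count, pixel size, and feed rate are placeholder assumptions.

```python
import cv2
import numpy as np


def extract_contour(slice_img, low=50, high=150):
    """Detect edges with the Canny operator and return the largest outer contour
    as an (N, 2) float array of pixel coordinates."""
    edges = cv2.Canny(slice_img, low, high)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    largest = max(contours, key=cv2.contourArea)
    return largest.reshape(-1, 2).astype(float)


def resample(contour, n_points=200):
    """Resample a contour to a fixed number of points so that point i on one layer
    corresponds to point i on the next layer (a crude stand-in for data matching)."""
    src_idx = np.arange(len(contour))
    dst_idx = np.linspace(0, len(contour) - 1, n_points)
    return np.stack([np.interp(dst_idx, src_idx, contour[:, k]) for k in (0, 1)], axis=1)


def interpolate_layers(c_lower, c_upper, n_insert):
    """Linearly blend two matched contours to create n_insert intermediate contours,
    thinning the effective layer height and reducing the staircase effect."""
    ts = np.linspace(0.0, 1.0, n_insert + 2)[1:-1]
    return [(1.0 - t) * c_lower + t * c_upper for t in ts]


def contour_to_gcode(contour, z_mm, pixel_mm=0.1, feed=1200):
    """Emit perimeter-only G code for one layer (extrusion E values omitted)."""
    pts = contour * pixel_mm
    lines = [f"G1 Z{z_mm:.3f} F{feed}",
             f"G0 X{pts[0, 0]:.3f} Y{pts[0, 1]:.3f}"]
    lines += [f"G1 X{x:.3f} Y{y:.3f} F{feed}" for x, y in pts[1:]]
    return "\n".join(lines)


# Placeholder usage: two adjacent CT slices -> three interpolated layers -> G code text.
if __name__ == "__main__":
    imgs = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in ("slice_000.png", "slice_001.png")]
    c0, c1 = (resample(extract_contour(im)) for im in imgs)
    layers = [c0, *interpolate_layers(c0, c1, n_insert=3), c1]
    gcode = "\n".join(contour_to_gcode(c, z_mm=0.2 * (i + 1)) for i, c in enumerate(layers))
    print(gcode[:400])
```

In the method described by the abstract, the matching step additionally resolves contours that split or merge between slices, and the final stage generates infill paths before encoding; both are deliberately left out here to keep the sketch short.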

Cite this article:

谭川东, 何泳江, 罗雪清, 方诚, 段黎明. 基于邻层数据匹配的工业CT图像生成G代码方法[J]. 仪器仪表学报, 2021, (4): 265-274.

History
  • Received:
  • Revised:
  • Accepted:
  • Published online: 2023-06-28
  • Publication date: