
Contents

  • What is CBCT
  • CBCT technical approaches
    • Using third-party tools
    • Implementing with Python
    • Implementing in the frontend
  • Pros and cons of a pure-frontend approach
    • Implementing CBCT with VolView
  • Using VolView
    • 1. Clone the code
    • 2. Install dependencies
    • 3. Run
    • 4. Result
  • Advanced: pair VolView with Python to fix lag
    • 1. Modify VtkThreeView.vue
    • 2. Add Custom3DView.vue
    • 3. Generate the STL file with Python
    • 4. Final result

What is CBCT

Radiological imaging is an indispensable part of medical software, and displaying, editing, and processing those images is a central concern. Among the various radiological modalities, CBCT is a key one. CBCT stands for oral and maxillofacial cone beam CT: a cone-shaped X-ray beam rotates around the patient's head, and computer algorithms reconstruct the scan into high-resolution three-dimensional images.

CBCT covers almost every subspecialty of dentistry:

  • Implantology: assess jawbone density and the position of the mandibular canal; support implant positioning and surgical guide design.
  • Orthodontics and impacted teeth: visualize tooth alignment, the position of impacted teeth, and their relationship to surrounding tissue, reducing extraction risk.
  • Endodontics: diagnose complex root canals, root fractures, and periapical lesions, improving treatment precision.
  • Oral and maxillofacial surgery: preoperative evaluation and postoperative monitoring of tumors and fractures.
  • Temporomandibular joint: clearly display structural abnormalities of the joint and aid the diagnosis of TMJ disorders.

CBCT technical approaches

Using third-party tools

There are quite a few tools that can produce the CBCT effect, for example Slicer.
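As a quick illustration of the third-party route, Slicer also exposes a built-in Python console, so the same volume rendering can be scripted. The sketch below follows the pattern from Slicer's script examples; the file path is a placeholder and the method names should be checked against your Slicer version.

# Run inside 3D Slicer's Python console (assumes a recent Slicer release).
import slicer

# Load a volume from disk; a DICOM series should be imported via the DICOM module first.
volumeNode = slicer.util.loadVolume('/path/to/cbct_volume.nrrd')

# Create the default volume-rendering display nodes and make them visible.
volRenLogic = slicer.modules.volumerendering.logic()
displayNode = volRenLogic.CreateDefaultVolumeRenderingNodes(volumeNode)
displayNode.SetVisibility(True)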

Implementing with Python

Use pydicom together with mpl_toolkits for a three-dimensional display.
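A minimal sketch of this route (the folder path, the iso-surface level of 300, and the variable names are illustrative, not from the original post): read the slices with pydicom, extract a surface with scikit-image's marching cubes, and draw it with mpl_toolkits' Poly3DCollection.

# Sketch: pydicom + skimage + mpl_toolkits 3D display of a DICOM series.
import os
import numpy as np
import pydicom
import matplotlib.pyplot as plt
from skimage import measure
from mpl_toolkits.mplot3d.art3d import Poly3DCollection

dicom_dir = '/path/to/dicom'  # hypothetical folder containing one CBCT series

# Load the slices and sort them along the scan axis.
slices = [pydicom.dcmread(os.path.join(dicom_dir, f)) for f in os.listdir(dicom_dir)]
slices.sort(key=lambda s: int(s.InstanceNumber))
volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)

# Extract an iso-surface (roughly bone) and plot it as a triangle mesh.
verts, faces, _, _ = measure.marching_cubes(volume, level=300, step_size=2)
fig = plt.figure(figsize=(8, 8))
ax = fig.add_subplot(111, projection='3d')
ax.add_collection3d(Poly3DCollection(verts[faces], alpha=0.7))
ax.set_xlim(0, volume.shape[0])
ax.set_ylim(0, volume.shape[1])
ax.set_zlim(0, volume.shape[2])
plt.show()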

Implementing in the frontend

This can be implemented with VTK.js, Three.js, or WebAssembly:

| Approach | Technology | Typical use | Pros and cons |
| --- | --- | --- | --- |
| VTK.js | WebGL + VTK | Medical image visualization (CT/MRI) | High quality, native volume rendering support, but data conversion is complex |
| Three.js + 3D textures | WebGL + shaders | General-purpose 3D visualization | Good compatibility, frontend-friendly, but lower medical accuracy |
| WebAssembly + medical engine | WASM | Professional medical imaging | Professional-grade software with strong performance, but high development effort |

Pros and cons of a pure-frontend approach

  • Performance and cost
    • Pros: low server dependence, saving hardware and maintenance costs; real-time interaction with low latency.
    • Cons: browser memory limits can cause crashes; weak GPUs on low-end devices lead to choppy rendering.
  • Data privacy
    • Pros: data never needs to be uploaded to a server, which helps meet privacy regulations such as HIPAA; offline caching allows use without a network connection.
    • Cons: data preprocessing still depends on the backend and may temporarily expose sensitive information.
  • Features and compatibility
    • Pros: supports basic 3D operations (rotation, zoom, section cuts).
    • Cons: complex algorithms (e.g. deep-learning segmentation) are hard to implement; browser compatibility is limited (e.g. older Safari).
  • Development and deployment
    • Pros: easy to deploy, since static frontend assets can be hosted on a CDN; well suited to lightweight applications (education, previews).
    • Cons: loading large datasets (e.g. a full-skull CBCT) is slow; extra compression and chunked-loading logic is needed.

Implementing CBCT with VolView

VolView is an open-source medical image viewer built on VTK.js. You can drag and drop DICOM data straight into the browser to generate 2D slices and cinematic 3D volume renderings, and it provides annotation, measurement, and other tools. All data is processed locally, which keeps it private. No installation is required and it runs cross-platform, making it suitable for clinical diagnosis as well as research and teaching.

Official site: https://volview.kitware.com/

An introductory video about VolView is also available for reference.

Using VolView

1. Clone the code

GitHub repository: https://github.com/Kitware/VolView

2. Install dependencies

npm i

3. Run

npm run dev

4. Result

Note: click on the left to load the demo data online, or click on the right to upload local DICOM image files.


Advanced: pair VolView with Python to fix lag

As mentioned above, the pure-frontend CBCT solution based on vtk.js can produce the effect we need without relying on any third-party software, but its performance requirements mean the machine opening the page must have a fairly high-end GPU; otherwise it becomes extremely laggy.

The idea is as follows:

The lag is caused mainly by the 3D display: its code computes the 3D effect in real time, which stalls the browser. To eliminate the lag, we therefore have to deal with the 3D display itself.

We can replace VolView's 3D rendering with an STL file generated by Python on the server side.
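The frontend then only needs to fetch that STL over HTTP. The following is a minimal sketch of the serving side; the Flask framework, the /cbct.stl route, and the port are assumptions for illustration, and any static file server with CORS enabled works just as well.

# Minimal sketch: expose the STL generated in step 3 below so Custom3DView.vue can load it.
from flask import Flask, send_file

app = Flask(__name__)
STL_PATH = "/mnt/data_18T/data/口腔/CBCT及三维重建/stl_path/cube2.stl"  # output of the script in step 3

@app.after_request
def allow_cors(response):
    # The VolView dev server runs on a different origin, so allow cross-origin requests.
    response.headers["Access-Control-Allow-Origin"] = "*"
    return response

@app.route("/cbct.stl")
def serve_stl():
    return send_file(STL_PATH, mimetype="model/stl")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)

With such an endpoint in place, the STLLoader in Custom3DView.vue (step 2) would point at a URL like http://<server>:5000/cbct.stl.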

1. Modify VtkThreeView.vue

Remove the component that originally rendered the 3D view and replace it with our new custom component: <Custom3DView />.

The full code is as follows:

<template>
  <div class="vtk-container-wrapper vtk-three-container">
    <div class="vtk-container" :class="active ? 'active' : ''">
      <!-- The 3D reconstruction is drawn here: start -->
      <div class="vtk-sub-container">
        <!-- <div
          class="vtk-view"
          ref="vtkContainerRef"
          data-testid="vtk-view vtk-three-view"
        ></div> -->
        <Custom3DView />
      </div>
      <!-- The 3D reconstruction is drawn here: end -->
      <div class="overlay-no-events tool-layer">
        <crop-tool :view-id="viewID" />
        <pan-tool :viewId="viewID" />
      </div>
      <view-overlay-grid class="overlay-no-events view-annotations">
        <template v-slot:top-left>
          <div class="annotation-cell">
            <v-btn
              class="pointer-events-all"
              dark
              icon
              size="medium"
              variant="text"
              @click="resetCamera"
            >
              <v-icon size="medium" class="py-1">mdi-camera-flip-outline</v-icon>
              <v-tooltip
                location="right"
                activator="parent"
                transition="slide-x-transition"
              >Reset Camera</v-tooltip>
            </v-btn>
            <span class="ml-3">{{ topLeftLabel }}</span>
          </div>
        </template>
      </view-overlay-grid>
      <transition name="loading">
        <div v-if="isImageLoading" class="overlay-no-events loading">
          <div>Loading the image</div>
          <div><v-progress-circular indeterminate color="blue" /></div>
        </div>
      </transition>
    </div>
  </div>
</template>

<script lang="ts">
import {computed,defineComponent,onBeforeUnmount,onMounted,PropType,provide,ref,toRefs,watch,Ref,nextTick,
} from 'vue';
import { computedWithControl } from '@vueuse/core';
import { vec3 } from 'gl-matrix';import vtkVolumeRepresentationProxy from '@kitware/vtk.js/Proxy/Representations/VolumeRepresentationProxy';
import { Mode as LookupTableProxyMode } from '@kitware/vtk.js/Proxy/Core/LookupTableProxy';
import vtkPiecewiseFunctionProxy from '@kitware/vtk.js/Proxy/Core/PiecewiseFunctionProxy';
import vtkVolumeMapper from '@kitware/vtk.js/Rendering/Core/VolumeMapper';
import vtkImageData from '@kitware/vtk.js/Common/DataModel/ImageData';
import { getDiagonalLength } from '@kitware/vtk.js/Common/DataModel/BoundingBox';
import type { Vector3 } from '@kitware/vtk.js/types';import { useProxyManager } from '@/src/composables/useProxyManager';
import ViewOverlayGrid from '@/src/components/ViewOverlayGrid.vue';
import { useResizeObserver } from '../composables/useResizeObserver';
import { useCurrentImage } from '../composables/useCurrentImage';
import { useCameraOrientation } from '../composables/useCameraOrientation';
import vtkLPSView3DProxy from '../vtk/LPSView3DProxy';
import { useSceneBuilder } from '../composables/useSceneBuilder';
import { usePersistCameraConfig } from '../composables/usePersistCameraConfig';
import { useModelStore } from '../store/datasets-models';
import { LPSAxisDir } from '../types/lps';
import { useViewProxy } from '../composables/useViewProxy';
import { ViewProxyType } from '../core/proxies';
import { VolumeColorConfig } from '../store/view-configs/types';
import useVolumeColoringStore, {DEFAULT_AMBIENT,DEFAULT_DIFFUSE,DEFAULT_SPECULAR,
} from '../store/view-configs/volume-coloring';
import { getShiftedOpacityFromPreset } from '../utils/vtk-helpers';
import CropTool from './tools/crop/CropTool.vue';
import PanTool from './tools/PanTool.vue';
import { useWidgetManager } from '../composables/useWidgetManager';
import { VTKThreeViewWidgetManager } from '../constants';
import { useCropStore, croppingPlanesEqual } from '../store/tools/crop';
import { isViewAnimating } from '../composables/isViewAnimating';
import { ColoringConfig } from '../types/views';
import useViewCameraStore from '../store/view-configs/camera';
import { Maybe } from '../types';
import { useResetViewsEvents } from './tools/ResetViews.vue';
import Custom3DView from '@/src/components/Custom3DView.vue';function useCvrEffect(config: Ref<Maybe<VolumeColorConfig>>,imageRep: Ref<vtkVolumeRepresentationProxy | null>,viewProxy: Ref<vtkLPSView3DProxy>
) {const cvrParams = computed(() => config.value?.cvr);const repMapper = computedWithControl(imageRep,() => imageRep.value?.getMapper() as vtkVolumeMapper | undefined);const image = computedWithControl(imageRep,() => imageRep.value?.getInputDataSet() as vtkImageData | null | undefined);const volume = computedWithControl(imageRep,() => imageRep.value?.getVolumes()[0]);const renderer = computed(() => viewProxy.value.getRenderer());const isAnimating = isViewAnimating(viewProxy);const cvrEnabled = computed(() => {const enabled = !!cvrParams.value?.enabled;const animating = isAnimating.value;return enabled && !animating;});const requestRender = () => {if (!isAnimating.value) {viewProxy.value.renderLater();}};// lightsconst volumeCenter = computed(() => {if (!volume.value) return null;const volumeBounds = volume.value.getBounds();return [(volumeBounds[0] + volumeBounds[1]) / 2,(volumeBounds[2] + volumeBounds[3]) / 2,(volumeBounds[4] + volumeBounds[5]) / 2,] as Vector3;});const lightFollowsCamera = computed(() => cvrParams.value?.lightFollowsCamera ?? true);watch([volumeCenter, renderer, cvrEnabled, lightFollowsCamera],([center, ren, enabled, lightFollowsCamera_]) => {if (!center) return;if (ren.getLights().length === 0) {ren.createLight();}const light = ren.getLights()[0];if (enabled) {light.setFocalPoint(...center);light.setColor(1, 1, 1);light.setIntensity(1);light.setConeAngle(90);light.setPositional(true);ren.setTwoSidedLighting(false);if (lightFollowsCamera_) {light.setLightTypeToHeadLight();ren.updateLightsGeometryToFollowCamera();} else {light.setLightTypeToSceneLight();}} else {light.setPositional(false);}requestRender();},{ immediate: true });// sampling distanceconst volumeQuality = computed(() => cvrParams.value?.volumeQuality);watch([volume, image, repMapper, volumeQuality, cvrEnabled, isAnimating],([volume_, image_, mapper, volumeQuality_, enabled, animating]) => {if (!volume_ || !mapper || volumeQuality_ == null || !image_) return;if (animating) {mapper.setSampleDistance(0.75);mapper.setMaximumSamplesPerRay(1000);mapper.setGlobalIlluminationReach(0);mapper.setComputeNormalFromOpacity(false);} else {const dims = image_.getDimensions();const spacing = image_.getSpacing();const spatialDiagonal = vec3.length(vec3.fromValues(dims[0] * spacing[0],dims[1] * spacing[1],dims[2] * spacing[2]));// Use the average spacing for sampling by defaultlet sampleDistance = spacing.reduce((a, b) => a + b) / 3.0;// Adjust the volume sampling by the quality slider valuesampleDistance /= volumeQuality_ > 1 ? 0.5 * volumeQuality_ ** 2 : 1.0;const samplesPerRay = spatialDiagonal / sampleDistance + 1;mapper.setMaximumSamplesPerRay(samplesPerRay);mapper.setSampleDistance(sampleDistance);// Adjust the global illumination reach by volume quality slidermapper.setGlobalIlluminationReach(enabled ? 0.25 * volumeQuality_ : 0);mapper.setComputeNormalFromOpacity(!enabled && volumeQuality_ > 2);}requestRender();},{ immediate: true });// volume propertiesconst ambient = computed(() => cvrParams.value?.ambient ?? 0);const diffuse = computed(() => cvrParams.value?.diffuse ?? 0);const specular = computed(() => cvrParams.value?.specular ?? 
0);watch([volume, image, ambient, diffuse, specular, cvrEnabled],([volume_, image_, ambient_, diffuse_, specular_, enabled]) => {if (!volume_ || !image_) return;const property = volume_.getProperty();property.setScalarOpacityUnitDistance(0,(0.5 * getDiagonalLength(image_.getBounds())) /Math.max(...image_.getDimensions()));property.setShade(true);property.setUseGradientOpacity(0, !enabled);property.setGradientOpacityMinimumValue(0, 0.0);const dataRange = image_.getPointData().getScalars().getRange();property.setGradientOpacityMaximumValue(0,(dataRange[1] - dataRange[0]) * 0.01);property.setGradientOpacityMinimumOpacity(0, 0.0);property.setGradientOpacityMaximumOpacity(0, 1.0);// do not toggle these parameters when animatingproperty.setAmbient(enabled ? ambient_ : DEFAULT_AMBIENT);property.setDiffuse(enabled ? diffuse_ : DEFAULT_DIFFUSE);property.setSpecular(enabled ? specular_ : DEFAULT_SPECULAR);requestRender();},{ immediate: true });// volumetric scattering blendingconst useVolumetricScatteringBlending = computed(() => cvrParams.value?.useVolumetricScatteringBlending ?? false);const volumetricScatteringBlending = computed(() => cvrParams.value?.volumetricScatteringBlending ?? 0);watch([useVolumetricScatteringBlending,volumetricScatteringBlending,repMapper,cvrEnabled,],([useVsb, vsb, mapper, enabled]) => {if (!mapper) return;if (enabled && useVsb) {mapper.setVolumetricScatteringBlending(vsb);} else {mapper.setVolumetricScatteringBlending(0);}requestRender();},{ immediate: true });// local ambient occlusionconst useLocalAmbientOcclusion = computed(() => cvrParams.value?.useLocalAmbientOcclusion ?? false);const laoKernelSize = computed(() => cvrParams.value?.laoKernelSize ?? 0);const laoKernelRadius = computed(() => cvrParams.value?.laoKernelRadius ?? 0);watch([useLocalAmbientOcclusion,laoKernelSize,laoKernelRadius,repMapper,cvrEnabled,],([useLao, kernelSize, kernelRadius, mapper, enabled]) => {if (!mapper) return;if (enabled && useLao) {mapper.setLocalAmbientOcclusion(true);mapper.setLAOKernelSize(kernelSize);mapper.setLAOKernelRadius(kernelRadius);} else {mapper.setLocalAmbientOcclusion(false);mapper.setLAOKernelSize(0);mapper.setLAOKernelRadius(0);}requestRender();},{ immediate: true });
}function useColoringEffect(config: Ref<Maybe<ColoringConfig>>,imageRep: Ref<vtkVolumeRepresentationProxy | null>,viewProxy: Ref<vtkLPSView3DProxy>
) {const colorBy = computed(() => config.value?.colorBy);const colorTransferFunction = computed(() => config.value?.transferFunction);const opacityFunction = computed(() => config.value?.opacityFunction);const proxyManager = useProxyManager();watch([imageRep, colorBy, colorTransferFunction, opacityFunction],([rep, colorBy_, colorFunc, opacityFunc]) => {if (!rep || !colorBy_ || !colorFunc || !opacityFunc || !proxyManager) {return;}const { arrayName, location } = colorBy_;const lut = proxyManager.getLookupTable(arrayName);lut.setMode(LookupTableProxyMode.Preset);lut.setPresetName(colorFunc.preset);lut.setDataRange(...colorFunc.mappingRange);const pwf = proxyManager.getPiecewiseFunction(arrayName);pwf.setMode(opacityFunc.mode);pwf.setDataRange(...opacityFunc.mappingRange);switch (opacityFunc.mode) {case vtkPiecewiseFunctionProxy.Mode.Gaussians:pwf.setGaussians(opacityFunc.gaussians);break;case vtkPiecewiseFunctionProxy.Mode.Points: {const opacityPoints = getShiftedOpacityFromPreset(opacityFunc.preset,opacityFunc.mappingRange,opacityFunc.shift,opacityFunc.shiftAlpha);if (opacityPoints) {pwf.setPoints(opacityPoints);}break;}case vtkPiecewiseFunctionProxy.Mode.Nodes:pwf.setNodes(opacityFunc.nodes);break;default:}if (rep) {// control color range manuallyrep.setRescaleOnColorBy(false);rep.setColorBy(arrayName, location);}// Need to trigger a render for when we are restoring from a state fileviewProxy.value.renderLater();},{ immediate: true });
}export default defineComponent({props: {id: {type: String,required: true,},viewDirection: {type: String as PropType<LPSAxisDir>,required: true,},viewUp: {type: String as PropType<LPSAxisDir>,required: true,},},components: {ViewOverlayGrid,CropTool,PanTool,Custom3DView,},setup(props) {const modelStore = useModelStore();const volumeColoringStore = useVolumeColoringStore();const viewCameraStore = useViewCameraStore();const { id: viewID, viewDirection, viewUp } = toRefs(props);const vtkContainerRef = ref<HTMLElement>();// --- computed vars --- //const {currentImageID: curImageID,currentImageMetadata: curImageMetadata,currentImageData,isImageLoading,} = useCurrentImage();// --- view proxy setup --- //const { viewProxy, setContainer: setViewProxyContainer } =useViewProxy<vtkLPSView3DProxy>(viewID, ViewProxyType.Volume);onMounted(() => {viewProxy.value.setOrientationAxesVisibility(true);viewProxy.value.setOrientationAxesType('cube');viewProxy.value.setBackground([0, 0, 0, 0]);setViewProxyContainer(vtkContainerRef.value);});onBeforeUnmount(() => {setViewProxyContainer(null);viewProxy.value.setContainer(null);});useResizeObserver(vtkContainerRef, () => viewProxy.value.resize());// --- scene setup --- //const { baseImageRep } = useSceneBuilder<vtkVolumeRepresentationProxy>(viewID,{baseImage: curImageID,models: computed(() => modelStore.idList),});// --- picking --- //// disables picking for crop control and morewatch(baseImageRep,(rep) => {if (rep) {rep.getVolumes().forEach((volume) => volume.setPickable(false));}},{ immediate: true });// --- widget manager --- //const { widgetManager } = useWidgetManager(viewProxy);provide(VTKThreeViewWidgetManager, widgetManager);// --- camera setup --- //const { cameraUpVec, cameraDirVec } = useCameraOrientation(viewDirection,viewUp,curImageMetadata);const resetCamera = () => {const bounds = curImageMetadata.value.worldBounds;const center = [(bounds[0] + bounds[1]) / 2,(bounds[2] + bounds[3]) / 2,(bounds[4] + bounds[5]) / 2,] as vec3;viewProxy.value.updateCamera(cameraDirVec.value,cameraUpVec.value,center);viewProxy.value.resetCamera();viewProxy.value.renderLater();};watch([baseImageRep, cameraDirVec, cameraUpVec],() => {const cameraConfig = viewCameraStore.getConfig(viewID.value,curImageID.value);// We don't want to reset the camera if we have a config we are restoringif (!cameraConfig) {// nextTick ensures resetCamera gets called after// useSceneBuilder refreshes the scene.nextTick(resetCamera);}},{immediate: true,});const { restoreCameraConfig } = usePersistCameraConfig(viewID,curImageID,viewProxy,'position','focalPoint','directionOfProjection','viewUp');watch(curImageID, () => {// See if we have a camera configuration to restoreconst cameraConfig = viewCameraStore.getConfig(viewID.value,curImageID.value);if (cameraConfig) {restoreCameraConfig(cameraConfig);viewProxy.value.getRenderer().resetCameraClippingRange();viewProxy.value.renderLater();}});// --- coloring setup --- //const volumeColorConfig = computed(() =>volumeColoringStore.getConfig(viewID.value, curImageID.value));watch([viewID, curImageID],() => {if (curImageID.value &&currentImageData.value &&!volumeColorConfig.value) {volumeColoringStore.resetToDefaultColoring(viewID.value,curImageID.value,currentImageData.value);}},{ immediate: true });// --- CVR parameters --- //useCvrEffect(volumeColorConfig, baseImageRep, viewProxy);// --- coloring --- //useColoringEffect(volumeColorConfig, baseImageRep, viewProxy);// --- cropping planes --- //const cropStore = useCropStore();const croppingPlanes = 
cropStore.getComputedVTKPlanes(curImageID);watch(croppingPlanes,(planes, oldPlanes) => {const mapper = baseImageRep.value?.getMapper();if (!mapper ||!planes ||(oldPlanes && croppingPlanesEqual(planes, oldPlanes)))return;mapper.removeAllClippingPlanes();planes.forEach((plane) => mapper.addClippingPlane(plane));mapper.modified();viewProxy.value.renderLater();},{ immediate: true });// --- Listen to ResetViews event --- //const events = useResetViewsEvents();events.onClick(() => resetCamera());// --- template vars --- //return {vtkContainerRef,viewID,active: false,topLeftLabel: computed(() =>volumeColorConfig.value?.transferFunction.preset.replace(/-/g, ' ') ??''),isImageLoading,resetCamera,};},
});
</script>

<style scoped>
.model-container {
  width: 100%;
  height: 600px;
  position: relative;
}
</style>

<style scoped src="@/src/components/styles/vtk-view.css"></style>
<style scoped src="@/src/components/styles/utils.css"></style>

<style scoped>
.vtk-three-container {
  background-color: black;
  grid-template-columns: auto;
}
</style>

2. Add Custom3DView.vue


Create a new Custom3DView.vue under the src/components directory. It displays the STL generated by the backend Python script.

The full code is as follows:

<template>
  <div ref="container" class="model-container"></div>
</template>

<script>
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls';
import { STLLoader } from 'three/examples/jsm/loaders/STLLoader';
import { toRaw } from 'vue';

export default {
  data() {
    return {
      loadingProgress: 0,
      loadError: null,
      animateId: null
    };
  },
  mounted() {
    this.initThreeContext();
    this.loadSTLModel();
    this.setupAnimation();
  },
  beforeDestroy() {
    this.cleanupResources();
  },
  methods: {
    initThreeContext() {
      const container = this.$refs.container;

      // Scene setup
      this._scene = new THREE.Scene();
      this._scene.background = new THREE.Color(0x000000);

      // Camera setup
      this._camera = new THREE.PerspectiveCamera(
        45, // narrower field of view for a closer-up look
        container.clientWidth / container.clientHeight,
        0.1,
        500 // smaller far plane to improve rendering performance
      );
      this._camera.position.set(30, 30, 30); // start the camera closer to the model

      // Renderer setup (black background)
      this._renderer = new THREE.WebGLRenderer({
        antialias: true,
        alpha: true // keep the alpha channel for later extensions
      });
      this._renderer.setClearColor(0x000000, 1); // make doubly sure the background is black
      this._renderer.setSize(container.clientWidth, container.clientHeight);
      container.appendChild(this._renderer.domElement);

      // Lighting
      const ambientLight = new THREE.AmbientLight(0x404040);
      const directionalLight = new THREE.DirectionalLight(0xffffff, 0.8);
      directionalLight.position.set(15, 15, 15);
      this._scene.add(ambientLight, directionalLight);

      // Orbit controls
      this._controls = new OrbitControls(toRaw(this._camera), this._renderer.domElement);
      this._controls.enableDamping = true;
      this._controls.dampingFactor = 0.05;
    },
    loadSTLModel() {
      const objSTLLoader = new STLLoader();
      objSTLLoader.crossOrigin = 'Anonymous';
      objSTLLoader.load(
        'https://stl所在路径.stl', // URL of the server-generated STL (placeholder in the original)
        geometry => {
          // Clear the old model before adding a new one
          this.clearExistingModel();

          // Material (light gray)
          const material = new THREE.MeshPhongMaterial({
            color: 0xcccccc, // light gray
            specular: 0x222222,
            shininess: 150,
            side: THREE.DoubleSide
          });
          const mesh = new THREE.Mesh(geometry, material);
          geometry.center();
          mesh.scale.set(0.1, 0.1, 0.1);

          // Automatically focus the camera on the model
          const box = new THREE.Box3().setFromObject(mesh);
          const center = box.getCenter(new THREE.Vector3());
          toRaw(this._camera).lookAt(center);
          toRaw(this._scene).add(mesh);
        },
        progress => {
          this.loadingProgress = (progress.loaded / progress.total) * 100;
        },
        error => {
          this.loadError = 'Failed to load the model; check the network or the file path';
        }
      );
    },
    clearExistingModel() {
      // Not defined in the original post; minimal helper that removes previously loaded meshes.
      const scene = toRaw(this._scene);
      scene.children
        .filter(obj => obj.isMesh)
        .forEach(obj => {
          obj.geometry.dispose();
          obj.material.dispose();
          scene.remove(obj);
        });
    },
    setupAnimation() {
      const animate = () => {
        this.animateId = requestAnimationFrame(animate);
        toRaw(this._controls).update();
        this._renderer.render(toRaw(this._scene), toRaw(this._camera));
      };
      animate();
    },
    cleanupResources() {
      cancelAnimationFrame(this.animateId);
      toRaw(this._controls).dispose();
      this._renderer.dispose();
      toRaw(this._scene).traverse(obj => {
        if (obj.isMesh) {
          obj.geometry.dispose();
          obj.material.dispose();
        }
      });
    }
  }
};
</script>

<style scoped>
.model-container {
  width: 100%;
  height: 600px;
  position: relative;
  background: #000; /* fallback black background */
}
</style>

3. Generate the STL file with Python

Generate the STL with Python on the server:

from pydicom import dcmread
import pylibjpeg
import numpy as np
import pydicom
import pydicom.pixel_data_handlers.gdcm_handler as gdcm_handler
import os
import matplotlib.pyplot as plt
from glob import glob
from mpl_toolkits.mplot3d.art3d import Poly3DCollection
import scipy.ndimage
from skimage import measure
from mpl_toolkits import mplot3d
from stl import mesh
import trimesh

pydicom.config.image_handlers = [None, gdcm_handler]
pydicom.config.image_handlers = ['gdcm_handler']


def load_scan(path):
    slices = []
    # count = 0
    for s in os.listdir(path):
        ds = pydicom.dcmread(path + '/' + s, force=True)
        ds.PhotometricInterpretation = 'YBR_FULL'
        if s != '.DS_Store':  # This is for AttributeError: 'FileDataset' object has no attribute 'InstanceNumber'
            slices.append(ds)
    slices.sort(key=lambda x: int(x.InstanceNumber))
    try:
        slice_thickness = np.abs(slices[0].ImagePositionPatient[2] - slices[1].ImagePositionPatient[2])
    except:
        slice_thickness = np.abs(slices[0].SliceLocation - slices[1].SliceLocation)
    for s in slices:
        s.SliceThickness = slice_thickness
    return slices


def get_pixels_hu(scans):
    image = np.stack([s.pixel_array for s in scans])
    image = image.astype(np.int16)
    image[image == -2000] = 0
    # Convert to Hounsfield units (HU)
    intercept = scans[0].RescaleIntercept
    slope = scans[0].RescaleSlope
    if slope != 1:
        image = slope * image.astype(np.float64)
        image = image.astype(np.int16)
    image += np.int16(intercept)
    return np.array(image, dtype=np.int16)


def make_mesh(image, threshold=-300, step_size=1):
    print("Transposing surface")
    p = image.transpose(2, 1, 0)
    print("Calculating surface")
    verts, faces, norm, val = measure.marching_cubes(p, threshold, step_size=step_size, allow_degenerate=True)
    return verts, faces


def resample(image, scan, new_spacing=[1, 1, 1]):
    # Determine current pixel spacing; change this function to get better results
    spacing = [float(scan[0].SliceThickness)] + [float(i) for i in scan[0].PixelSpacing]
    spacing = np.array(spacing)
    resize_factor = [spacing[0] / new_spacing[0], spacing[1] / new_spacing[1], spacing[2] / new_spacing[2]]
    new_real_shape = np.multiply(image.shape, resize_factor)
    new_shape = np.round(new_real_shape)
    real_resize_factor = new_shape / image.shape
    new_spacing = spacing / real_resize_factor
    image = scipy.ndimage.zoom(image, real_resize_factor)
    return image, new_spacing


if __name__ == "__main__":
    from matplotlib.cm import get_cmap
    import matplotlib.colors as mcolors

    data_path = "/mnt/data_18T/data/口腔/CBCT及三维重建/dicom"
    output_path = "/mnt/data_18T/data/口腔/CBCT及三维重建/stl_path/"
    if not os.path.exists(output_path):  # create the output path
        os.mkdir(output_path)
    patient = load_scan(data_path)
    images = get_pixels_hu(patient)
    imgs_after_resamp, spacing = resample(images.astype(np.float64), patient, [1, 0.5, 1])
    v, f = make_mesh(imgs_after_resamp, 350, 1)
    # save the stl file
    vertices = v
    faces = f
    # create a color list (not actually used by the STL export)
    colors = get_cmap('Greens')(np.linspace(0, 1, len(vertices)))
    colors = mcolors.to_rgba_array(colors)
    mesh = trimesh.Trimesh(vertices=vertices, faces=faces)
    mesh.export(output_path + 'cube2.stl', file_type="stl")
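Before wiring the file into the frontend, it can help to reload the exported STL and check its size, since a very large face count is exactly what makes in-browser rendering slow. A quick check (a sketch; only trimesh is assumed):

# Reload the exported STL and print a few statistics.
import trimesh

m = trimesh.load("/mnt/data_18T/data/口腔/CBCT及三维重建/stl_path/cube2.stl")
print("vertices:", len(m.vertices))
print("faces:", len(m.faces))
print("watertight:", m.is_watertight)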

4. Final result

Note: my machine only has integrated graphics and is not a high-end configuration, yet with the STL-based 3D display everything runs very smoothly.
