2024/10/24 21:20:09 - mmengine - INFO -
------------------------------------------------------------
System environment:
    sys.platform: linux
    Python: 3.9.5 (default, Jun 4 2021, 12:28:51) [GCC 7.5.0]
    CUDA available: True
    MUSA available: False
    numpy_random_seed: 629861339
    GPU 0,1,2,3: Tesla V100-SXM2-32GB
    CUDA_HOME: /usr/local/cuda
    NVCC: Cuda compilation tools, release 11.8, V11.8.89
    GCC: gcc (GCC) 7.3.1 20180303 (Red Hat 7.3.1-5)
    PyTorch: 2.1.2+cu118
    PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201703
  - Intel(R) oneAPI Math Kernel Library Version 2022.2-Product Build 20220804 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v3.1.1 (Git Hash 64f6bcbcbab628e96f33a62c3e975f8535a7bde4)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX512
  - CUDA Runtime 11.8
  - NVCC architecture flags: -gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_90,code=sm_90
  - CuDNN 8.7
  - Magma 2.6.1
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.8, CUDNN_VERSION=8.7.0, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -D_GLIBCXX_USE_CXX11_ABI=0 -fabi-version=11 -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -DNDEBUG -DUSE_KINETO -DLIBKINETO_NOROCTRACER -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Werror=bool-operation -Wnarrowing -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=old-style-cast -Wno-invalid-partial-specialization -Wno-unused-private-field -Wno-aligned-allocation-unavailable -Wno-missing-braces -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_DISABLE_GPU_ASSERTS=ON, TORCH_VERSION=2.1.2, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=1, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
    TorchVision: 0.16.2+cu118
    OpenCV: 4.9.0
    MMEngine: 0.10.5

Runtime environment:
    cudnn_benchmark: True
    mp_cfg: {'mp_start_method': 'fork', 'opencv_num_threads': 0}
    dist_cfg: {'backend': 'nccl'}
    seed: 629861339
    Distributed launcher: pytorch
    Distributed training: True
    GPU number: 4
------------------------------------------------------------

2024/10/24 21:20:10 - mmengine - INFO - Config:
bs_ratio = 4
crop_size = (512, 512)
data_preprocessor = dict(
    bgr_to_rgb=True,
    mean=[123.675, 116.28, 103.53],
    pad_val=0,
    seg_pad_val=255,
    size=(512, 512),
    std=[58.395, 57.12, 57.375],
    type='SegDataPreProcessor')
data_root = 'data/ade/ADEChallengeData2016'
dataset_type = 'ADE20KDataset'
default_hooks = dict(
    checkpoint=dict(by_epoch=False, interval=8000, type='CheckpointHook'),
    logger=dict(interval=50, log_metric_by_epoch=False, type='LoggerHook'),
    param_scheduler=dict(type='ParamSchedulerHook'),
    sampler_seed=dict(type='DistSamplerSeedHook'),
    timer=dict(type='IterTimerHook'),
    visualization=dict(type='SegVisualizationHook'))
default_scope = 'mmseg'
env_cfg = dict(
    cudnn_benchmark=True,
    dist_cfg=dict(backend='nccl'),
    mp_cfg=dict(mp_start_method='fork', opencv_num_threads=0))
img_ratios = [0.5, 0.75, 1.0, 1.25, 1.5, 1.75]
launcher = 'pytorch'
load_from = None
log_level = 'INFO'
log_processor = dict(by_epoch=False)
max_iters = 80000
model = dict(
    auxiliary_head=dict(
        align_corners=False,
        channels=256,
        concat_input=False,
        dropout_ratio=0.1,
        in_channels=376,
        in_index=1,
        loss_decode=dict(loss_weight=0.4, type='CrossEntropyLoss', use_sigmoid=False),
        norm_cfg=dict(requires_grad=True, type='SyncBN'),
        num_classes=150,
        num_convs=1,
        type='FCNHead'),
    backbone=dict(
        depth=[2, 3, 2],
        distillation=False,
        down_ops=[['subsample', 2], ['subsample', 2], ['']],
        drop_path=0.03,
        embed_dim=[200, 376, 448],
        forward_type='v052d',
        frozen_stages=-1,
        global_ratio=[0.8, 0.7, 0.6],
        img_size=224,
        in_chans=3,
        kernels=[7, 5, 3],
        local_ratio=[0.2, 0.2, 0.3],
        norm_eval=False,
        num_classes=80,
        num_heads=[4, 4, 4],
        out_indices=(1, 2, 3),
        patch_size=16,
        pretrained='../../weights/MobileMamba_B4/mobilemamba_b4.pth',
        ssm_ratio=2,
        stages=['s', 's', 's'],
        sync_bn=False,
        type='MobileMamba',
        window_size=[7, 7, 7]),
    data_preprocessor=dict(
        bgr_to_rgb=True,
        mean=[123.675, 116.28, 103.53],
        pad_val=0,
        seg_pad_val=255,
        size=(512, 512),
        std=[58.395, 57.12, 57.375],
        type='SegDataPreProcessor'),
    decode_head=dict(
        align_corners=False,
        channels=256,
        dilations=(1, 12, 24, 36),
        dropout_ratio=0.1,
        in_channels=448,
        in_index=2,
        loss_decode=dict(loss_weight=1.0, type='CrossEntropyLoss', use_sigmoid=False),
        norm_cfg=dict(requires_grad=True, type='SyncBN'),
        num_classes=150,
        type='ASPPHead'),
    pretrained=None,
    test_cfg=dict(mode='whole'),
    train_cfg=dict(),
    type='EncoderDecoder')
norm_cfg = dict(requires_grad=True, type='SyncBN')
optim_wrapper = dict(
    clip_grad=dict(max_norm=0.1, norm_type=2),
    optimizer=dict(betas=(0.9, 0.999), lr=0.00012, type='AdamW', weight_decay=0.05),
    paramwise_cfg=dict(
        custom_keys=dict(
            absolute_pos_embed=dict(decay_mult=0.0),
            norm=dict(decay_mult=0.0),
            relative_position_bias_table=dict(decay_mult=0.0))),
    type='OptimWrapper')
optimizer = dict(lr=0.01, momentum=0.9, type='SGD', weight_decay=0.0005)
param_scheduler = [
    dict(begin=0, by_epoch=False, end=500, start_factor=1e-05, type='LinearLR'),
    dict(
        T_max=40000,
        begin=40000,
        by_epoch=False,
        end=80000,
        eta_min=0,
        type='CosineAnnealingLR'),
]
ratio = 1
resume = False
test_cfg = dict(type='TestLoop')
test_dataloader = dict(
    batch_size=1,
    dataset=dict(
        data_prefix=dict(
            img_path='images/validation',
            seg_map_path='annotations/validation'),
        data_root='data/ade/ADEChallengeData2016',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(keep_ratio=True, scale=(2048, 512), type='Resize'),
            dict(reduce_zero_label=True, type='LoadAnnotations'),
            dict(type='PackSegInputs'),
        ],
        type='ADE20KDataset'),
    num_workers=2,
    persistent_workers=True,
    sampler=dict(shuffle=False, type='DefaultSampler'))
test_evaluator = dict(iou_metrics=['mIoU'], type='IoUMetric')
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(keep_ratio=True, scale=(2048, 512), type='Resize'),
    dict(reduce_zero_label=True, type='LoadAnnotations'),
    dict(type='PackSegInputs'),
]
train_cfg = dict(max_iters=80000, type='IterBasedTrainLoop', val_interval=8000)
train_dataloader = dict(
    batch_size=8,
    dataset=dict(
        data_prefix=dict(
            img_path='images/training',
            seg_map_path='annotations/training'),
        data_root='data/ade/ADEChallengeData2016',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(reduce_zero_label=True, type='LoadAnnotations'),
            dict(keep_ratio=True, ratio_range=(0.5, 2.0), scale=(2048, 512), type='RandomResize'),
            dict(cat_max_ratio=0.75, crop_size=(512, 512), type='RandomCrop'),
            dict(prob=0.5, type='RandomFlip'),
            dict(type='PhotoMetricDistortion'),
            dict(type='PackSegInputs'),
        ],
        type='ADE20KDataset'),
    num_workers=8,
    persistent_workers=True,
    sampler=dict(shuffle=True, type='InfiniteSampler'))
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(reduce_zero_label=True, type='LoadAnnotations'),
    dict(keep_ratio=True, ratio_range=(0.5, 2.0), scale=(2048, 512), type='RandomResize'),
    dict(cat_max_ratio=0.75, crop_size=(512, 512), type='RandomCrop'),
    dict(prob=0.5, type='RandomFlip'),
    dict(type='PhotoMetricDistortion'),
    dict(type='PackSegInputs'),
]
tta_model = dict(type='SegTTAModel')
tta_pipeline = [
    dict(backend_args=None, type='LoadImageFromFile'),
    dict(
        transforms=[
            [
                dict(keep_ratio=True, scale_factor=0.5, type='Resize'),
                dict(keep_ratio=True, scale_factor=0.75, type='Resize'),
                dict(keep_ratio=True, scale_factor=1.0, type='Resize'),
                dict(keep_ratio=True, scale_factor=1.25, type='Resize'),
                dict(keep_ratio=True, scale_factor=1.5, type='Resize'),
                dict(keep_ratio=True, scale_factor=1.75, type='Resize'),
            ],
            [
                dict(direction='horizontal', prob=0.0, type='RandomFlip'),
                dict(direction='horizontal', prob=1.0, type='RandomFlip'),
            ],
            [dict(type='LoadAnnotations')],
            [dict(type='PackSegInputs')],
        ],
        type='TestTimeAug'),
]
val_cfg = dict(type='ValLoop')
val_dataloader = dict(
    batch_size=1,
    dataset=dict(
        data_prefix=dict(
            img_path='images/validation',
            seg_map_path='annotations/validation'),
        data_root='data/ade/ADEChallengeData2016',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(keep_ratio=True, scale=(2048, 512), type='Resize'),
            dict(reduce_zero_label=True, type='LoadAnnotations'),
            dict(type='PackSegInputs'),
        ],
        type='ADE20KDataset'),
    num_workers=2,
    persistent_workers=True,
    sampler=dict(shuffle=False, type='DefaultSampler'))
val_evaluator = dict(iou_metrics=['mIoU'], type='IoUMetric')
vis_backends = [dict(type='LocalVisBackend')]
visualizer = dict(
    name='visualizer',
    type='SegLocalVisualizer',
    vis_backends=[dict(type='LocalVisBackend')])
work_dir = './work_dirs/deeplabv3_mobilemamba_B4-80k_ade20k-512x512'

2024/10/24 21:20:12 - mmengine - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) RuntimeInfoHook
(BELOW_NORMAL) LoggerHook
 --------------------
before_train:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_train_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(NORMAL      ) DistSamplerSeedHook
 --------------------
before_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
 --------------------
after_train_iter:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(NORMAL      ) SegVisualizationHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_train_epoch:
(NORMAL      ) IterTimerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_val:
(VERY_HIGH   ) RuntimeInfoHook
 --------------------
before_val_epoch:
(NORMAL      ) IterTimerHook
 --------------------
before_val_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_val_iter:
(NORMAL      ) IterTimerHook
(NORMAL      ) SegVisualizationHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_val_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
(LOW         ) ParamSchedulerHook
(VERY_LOW    ) CheckpointHook
 --------------------
after_val:
(VERY_HIGH   ) RuntimeInfoHook
 --------------------
after_train:
(VERY_HIGH   ) RuntimeInfoHook
(VERY_LOW    ) CheckpointHook
 --------------------
before_test:
(VERY_HIGH   ) RuntimeInfoHook
 --------------------
before_test_epoch:
(NORMAL      ) IterTimerHook
 --------------------
before_test_iter:
(NORMAL      ) IterTimerHook
 --------------------
after_test_iter:
(NORMAL      ) IterTimerHook
(NORMAL      ) SegVisualizationHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_test_epoch:
(VERY_HIGH   ) RuntimeInfoHook
(NORMAL      ) IterTimerHook
(BELOW_NORMAL) LoggerHook
 --------------------
after_test:
(VERY_HIGH   ) RuntimeInfoHook
 --------------------
after_run:
(BELOW_NORMAL) LoggerHook
 --------------------
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks1.0.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks1.0.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks1.1.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks1.1.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.3.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.3.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.4.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.4.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.5.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks2.5.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks3.3.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks3.3.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks3.4.mixer.m.attn.global_op.wt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - WARNING - backbone.blocks3.4.mixer.m.attn.global_op.iwt_filter is skipped since its requires_grad=False
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.weight:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.weight:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.weight:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.bias:lr=0.00012
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.bias:weight_decay=0.0
2024/10/24 21:20:14 - mmengine - INFO - paramwise_options -- backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.bias:decay_mult=0.0
2024/10/24 21:20:14 - mmengine - WARNING - The prefix is not set in metric class IoUMetric.
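The `paramwise_options` records above follow from `optim_wrapper.paramwise_cfg.custom_keys`: any parameter whose name contains one of the listed keys (here the `out_norm` parameters match `norm`) gets `decay_mult=0.0`, so its effective weight decay becomes 0.05 × 0.0 = 0 while the learning rate stays at the base 0.00012. A minimal standalone sketch of that matching logic (a hypothetical re-implementation for illustration, not mmengine's actual code, which also supports `lr_mult` and other options):

```python
# Base optimizer settings, taken from the AdamW config above.
base_lr = 0.00012
base_weight_decay = 0.05

# custom_keys from paramwise_cfg in the config above.
custom_keys = {
    'absolute_pos_embed': dict(decay_mult=0.0),
    'norm': dict(decay_mult=0.0),
    'relative_position_bias_table': dict(decay_mult=0.0),
}

def paramwise_options(param_name):
    """Return (lr, weight_decay) for a parameter name, scaling the base
    weight decay by the first matching key's decay_mult (substring match)."""
    lr, wd = base_lr, base_weight_decay
    for key, opts in custom_keys.items():
        if key in param_name:
            wd = base_weight_decay * opts.get('decay_mult', 1.0)
            break
    return lr, wd

# out_norm parameters match the 'norm' key, so weight decay is zeroed:
name = 'backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.weight'
print(paramwise_options(name))  # (0.00012, 0.0)
```

This reproduces the `lr=0.00012` / `weight_decay=0.0` pairs logged for every `out_norm` weight and bias; parameters matching no key keep the full 0.05 decay.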
Name of parameter - Initialization information backbone.patch_embed.0.c.weight - torch.Size([25, 3, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.0.bn.weight - torch.Size([25]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.0.bn.bias - torch.Size([25]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.2.c.weight - torch.Size([50, 25, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.2.bn.weight - torch.Size([50]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.2.bn.bias - torch.Size([50]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.4.c.weight - torch.Size([100, 50, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.4.bn.weight - torch.Size([100]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.4.bn.bias - torch.Size([100]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.6.c.weight - torch.Size([200, 100, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.6.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.patch_embed.6.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.dw0.m.c.weight - torch.Size([200, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.dw0.m.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.dw0.m.bn.bias - 
torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw1.c.weight - torch.Size([400, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw1.bn.weight - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw1.bn.bias - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw2.c.weight - torch.Size([200, 400, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw2.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn0.m.pw2.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([40, 1, 7, 7]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.bn1.weight - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.bn1.bias - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([40, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.bn2.weight - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.local_op.bn2.bias - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.wt_filter - torch.Size([640, 1, 2, 2]): The value is the 
same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.iwt_filter - torch.Size([640, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 12, 320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.Ds - torch.Size([640]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([640, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 320, 10]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([640, 160, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([640]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([640]): The value is the same before and after calling `init_weights` of EncoderDecoder 
backbone.blocks1.0.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([320, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([160, 320, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([160]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 160, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([640, 1, 7, 7]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 640, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.proj.1.c.weight - torch.Size([200, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.proj.1.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.mixer.m.attn.proj.1.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.dw1.m.c.weight - torch.Size([200, 1, 3, 3]): The value is the same before and after calling `init_weights` of 
EncoderDecoder backbone.blocks1.0.dw1.m.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.dw1.m.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw1.c.weight - torch.Size([400, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw1.bn.weight - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw1.bn.bias - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw2.c.weight - torch.Size([200, 400, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw2.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.0.ffn1.m.pw2.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.dw0.m.c.weight - torch.Size([200, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.dw0.m.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.dw0.m.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.ffn0.m.pw1.c.weight - torch.Size([400, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.ffn0.m.pw1.bn.weight - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.ffn0.m.pw1.bn.bias - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder 
backbone.blocks1.1.ffn0.m.pw2.c.weight - torch.Size([200, 400, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.ffn0.m.pw2.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.ffn0.m.pw2.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([40, 1, 7, 7]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.bn1.weight - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.bn1.bias - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([40, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.bn2.weight - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.local_op.bn2.bias - torch.Size([40]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.wt_filter - torch.Size([640, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.iwt_filter - torch.Size([640, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 12, 320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.Ds - torch.Size([640]): The value is the same before and after calling 
`init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([640, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 320, 10]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([640, 160, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([640]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([640]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([320, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([320]): The value is the same before and after calling `init_weights` of EncoderDecoder backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([160, 320, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder 
backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([160]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([160]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 160, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([640, 1, 7, 7]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 640, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.proj.1.c.weight - torch.Size([200, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.proj.1.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.mixer.m.attn.proj.1.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.dw1.m.c.weight - torch.Size([200, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.dw1.m.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.dw1.m.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw1.c.weight - torch.Size([400, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw1.bn.weight - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw1.bn.bias - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw2.c.weight - torch.Size([200, 400, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw2.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks1.1.ffn1.m.pw2.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.0.m.c.weight - torch.Size([200, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.0.m.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.0.m.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw1.c.weight - torch.Size([400, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw1.bn.weight - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw1.bn.bias - torch.Size([400]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw2.c.weight - torch.Size([200, 400, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw2.bn.weight - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.0.1.m.pw2.bn.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv1.c.weight - torch.Size([800, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv1.bn.weight - torch.Size([800]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv1.bn.bias - torch.Size([800]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv2.c.weight - torch.Size([800, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv2.bn.weight - torch.Size([800]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv2.bn.bias - torch.Size([800]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.se.fc1.weight - torch.Size([200, 800, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.se.fc1.bias - torch.Size([200]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.se.fc2.weight - torch.Size([800, 200, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.se.fc2.bias - torch.Size([800]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv3.c.weight - torch.Size([376, 800, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv3.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.1.conv3.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.0.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.0.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.0.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.2.1.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw0.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw0.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw0.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn0.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([75, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.bn1.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.bn1.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([75, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.bn2.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.local_op.bn2.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.wt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.iwt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 18, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.Ds - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([1024, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 512, 16]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([1024, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([512, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([256, 512, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([1024, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 1024, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.proj.1.c.weight - torch.Size([376, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.proj.1.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.mixer.m.attn.proj.1.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw1.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw1.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.dw1.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.3.ffn1.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw0.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw0.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw0.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn0.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([75, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.bn1.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.bn1.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([75, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.bn2.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.local_op.bn2.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.wt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.iwt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 18, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.Ds - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([1024, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 512, 16]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([1024, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([512, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([256, 512, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([1024, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 1024, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.proj.1.c.weight - torch.Size([376, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.proj.1.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.mixer.m.attn.proj.1.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw1.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw1.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.dw1.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.4.ffn1.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw0.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw0.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw0.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn0.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([75, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.bn1.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.bn1.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([75, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.bn2.weight - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.local_op.bn2.bias - torch.Size([75]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.wt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.iwt_filter - torch.Size([1024, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 18, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.Ds - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([1024, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 512, 16]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([1024, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([512, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([256, 512, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 256, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([1024, 1, 5, 5]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 1024, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.proj.1.c.weight - torch.Size([376, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.proj.1.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.mixer.m.attn.proj.1.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw1.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw1.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.dw1.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks2.5.ffn1.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.0.m.c.weight - torch.Size([376, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.0.m.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.0.m.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw1.c.weight - torch.Size([752, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw1.bn.weight - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw1.bn.bias - torch.Size([752]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw2.c.weight - torch.Size([376, 752, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw2.bn.weight - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.0.1.m.pw2.bn.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv1.c.weight - torch.Size([1504, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv1.bn.weight - torch.Size([1504]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv1.bn.bias - torch.Size([1504]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv2.c.weight - torch.Size([1504, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv2.bn.weight - torch.Size([1504]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv2.bn.bias - torch.Size([1504]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.se.fc1.weight - torch.Size([376, 1504, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.se.fc1.bias - torch.Size([376]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.se.fc2.weight - torch.Size([1504, 376, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.se.fc2.bias - torch.Size([1504]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv3.c.weight - torch.Size([448, 1504, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv3.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.1.conv3.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.0.m.c.weight - torch.Size([448, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.0.m.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.0.m.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw1.c.weight - torch.Size([896, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw1.bn.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw1.bn.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw2.c.weight - torch.Size([448, 896, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw2.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.2.1.m.pw2.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw0.m.c.weight - torch.Size([448, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw0.m.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw0.m.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw1.c.weight - torch.Size([896, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw1.bn.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw1.bn.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw2.c.weight - torch.Size([448, 896, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw2.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn0.m.pw2.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([134, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.bn1.weight - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.bn1.bias - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([134, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.bn2.weight - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.local_op.bn2.bias - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.wt_filter - torch.Size([1088, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.iwt_filter - torch.Size([1088, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 19, 544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.Ds - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([1088, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 544, 17]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([1088, 272, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([544, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([272, 544, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([272]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([272]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 272, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([1088, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 1088, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.proj.1.c.weight - torch.Size([448, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.proj.1.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.mixer.m.attn.proj.1.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw1.m.c.weight - torch.Size([448, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw1.m.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.dw1.m.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw1.c.weight - torch.Size([896, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw1.bn.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw1.bn.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw2.c.weight - torch.Size([448, 896, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw2.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.3.ffn1.m.pw2.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw0.m.c.weight - torch.Size([448, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw0.m.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw0.m.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw1.c.weight - torch.Size([896, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw1.bn.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw1.bn.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw2.c.weight - torch.Size([448, 896, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw2.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn0.m.pw2.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.dwconv3x3.weight - torch.Size([134, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.bn1.weight - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.bn1.bias - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.dwconv1x1.weight - torch.Size([134, 1, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.bn2.weight - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.local_op.bn2.bias - torch.Size([134]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.wt_filter - torch.Size([1088, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.iwt_filter - torch.Size([1088, 1, 2, 2]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.x_proj_weight - torch.Size([2, 19, 544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.Ds - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.A_logs - torch.Size([1088, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.dt_projs_weight - torch.Size([2, 544, 17]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.dt_projs_bias - torch.Size([2, 544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.weight - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_norm.bias - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.in_proj.c.weight - torch.Size([1088, 272, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.in_proj.bn.weight - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.in_proj.bn.bias - torch.Size([1088]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.conv2d.weight - torch.Size([544, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.conv2d.bias - torch.Size([544]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_proj.c.weight - torch.Size([272, 544, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_proj.bn.weight - torch.Size([272]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.global_atten.out_proj.bn.bias - torch.Size([272]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.base_scale.weight - torch.Size([1, 272, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.wavelet_convs.0.weight - torch.Size([1088, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.global_op.wavelet_scale.0.weight - torch.Size([1, 1088, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.proj.1.c.weight - torch.Size([448, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.proj.1.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.mixer.m.attn.proj.1.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw1.m.c.weight - torch.Size([448, 1, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw1.m.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.dw1.m.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw1.c.weight - torch.Size([896, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw1.bn.weight - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw1.bn.bias - torch.Size([896]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw2.c.weight - torch.Size([448, 896, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw2.bn.weight - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
backbone.blocks3.4.ffn1.m.pw2.bn.bias - torch.Size([448]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.conv_seg.weight - torch.Size([150, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0
decode_head.conv_seg.bias - torch.Size([150]): NormalInit: mean=0, std=0.01, bias=0
decode_head.image_pool.1.conv.weight - torch.Size([256, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.image_pool.1.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.image_pool.1.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.0.conv.weight - torch.Size([256, 448, 1, 1]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.0.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.0.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.1.conv.weight - torch.Size([256, 448, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.1.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.1.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.2.conv.weight - torch.Size([256, 448, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.2.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.2.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.3.conv.weight - torch.Size([256, 448, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.3.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.aspp_modules.3.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.bottleneck.conv.weight - torch.Size([256, 1280, 3, 3]): Initialized by user-defined `init_weights` in ConvModule
decode_head.bottleneck.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
decode_head.bottleneck.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
auxiliary_head.conv_seg.weight - torch.Size([150, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0
auxiliary_head.conv_seg.bias - torch.Size([150]): NormalInit: mean=0, std=0.01, bias=0
auxiliary_head.convs.0.conv.weight - torch.Size([256, 376, 3, 3]): The value is the same before and after calling `init_weights` of EncoderDecoder
auxiliary_head.convs.0.bn.weight - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
auxiliary_head.convs.0.bn.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of EncoderDecoder
2024/10/24 21:20:15 - mmengine - WARNING - "FileClient" will be deprecated in future.
Please use io functions in https://mmengine.readthedocs.io/en/latest/api/fileio.html#file-io
2024/10/24 21:20:15 - mmengine - WARNING - "HardDiskBackend" is the alias of "LocalBackend" and the former will be deprecated in future.
2024/10/24 21:20:15 - mmengine - INFO - Checkpoints will be saved to work_dirs/deeplabv3_mobilemamba_B4-80k_ade20k-512x512.
2024/10/24 21:22:04 - mmengine - INFO - Iter(train) [ 50/80000] base_lr: 1.1785e-05 lr: 1.1785e-05 eta: 2 days, 0:06:13 time: 1.9753 data_time: 0.0177 memory: 14618 grad_norm: 6.3332 loss: 5.6526 decode.loss_ce: 4.0384 decode.acc_seg: 5.8709 aux.loss_ce: 1.6142 aux.acc_seg: 1.7487
2024/10/24 21:23:45 - mmengine - INFO - Iter(train) [ 100/80000] base_lr: 2.3809e-05 lr: 2.3809e-05 eta: 1 day, 22:39:40 time: 1.9911 data_time: 0.0138 memory: 6643 grad_norm: 5.9773 loss: 5.4790 decode.loss_ce: 3.9127 decode.acc_seg: 12.9730 aux.loss_ce: 1.5663 aux.acc_seg: 15.3371
2024/10/24 21:25:24 - mmengine - INFO - Iter(train) [ 150/80000] base_lr: 3.5833e-05 lr: 3.5833e-05 eta: 1 day, 21:38:37 time: 1.9599 data_time: 0.0170 memory: 6644 grad_norm: 7.7981 loss: 4.8607 decode.loss_ce: 3.4565 decode.acc_seg: 39.7969 aux.loss_ce: 1.4041 aux.acc_seg: 32.0886
2024/10/24 21:27:02 - mmengine - INFO - Iter(train) [ 200/80000] base_lr: 4.7856e-05 lr: 4.7856e-05 eta: 1 day, 21:03:46 time: 1.9575 data_time: 0.0169 memory: 6645 grad_norm: 6.0861 loss: 5.1533 decode.loss_ce: 3.6538 decode.acc_seg: 22.9739 aux.loss_ce: 1.4996 aux.acc_seg: 23.0359
2024/10/24 21:28:44 - mmengine - INFO - Iter(train) [ 250/80000] base_lr: 5.9880e-05 lr: 5.9880e-05 eta: 1 day, 21:04:38 time: 1.9540 data_time: 0.0174 memory: 6645 grad_norm: 5.5364 loss: 4.1302 decode.loss_ce: 2.9286 decode.acc_seg: 37.9815 aux.loss_ce: 1.2016 aux.acc_seg: 37.6379
2024/10/24 21:30:21 - mmengine - INFO - Iter(train) [ 300/80000] base_lr: 7.1904e-05 lr: 7.1904e-05 eta: 1 day, 20:43:50 time: 1.9473 data_time: 0.0170 memory: 6643 grad_norm: 6.6195 loss: 3.5055 decode.loss_ce: 2.4889 decode.acc_seg: 45.7882 aux.loss_ce: 1.0166 aux.acc_seg: 39.2430
2024/10/24 21:31:59 - mmengine - INFO - Iter(train) [ 350/80000] base_lr: 8.3928e-05 lr: 8.3928e-05 eta: 1 day, 20:28:30 time: 1.9486 data_time: 0.0172 memory: 6643 grad_norm: 5.1062 loss: 3.5696 decode.loss_ce: 2.5211 decode.acc_seg: 54.8967 aux.loss_ce: 1.0485 aux.acc_seg: 42.2399
2024/10/24 21:33:36 - mmengine - INFO - Iter(train) [ 400/80000] base_lr: 9.5952e-05 lr: 9.5952e-05 eta: 1 day, 20:15:59 time: 1.9441 data_time: 0.0167 memory: 6645 grad_norm: 5.4504 loss: 3.2122 decode.loss_ce: 2.2805 decode.acc_seg: 33.9337 aux.loss_ce: 0.9317 aux.acc_seg: 33.4082
2024/10/24 21:35:14 - mmengine - INFO - Iter(train) [ 450/80000] base_lr: 1.0798e-04 lr: 1.0798e-04 eta: 1 day, 20:06:38 time: 1.9678 data_time: 0.0155 memory: 6643 grad_norm: 5.8295 loss: 2.8051 decode.loss_ce: 1.9859 decode.acc_seg: 54.1562 aux.loss_ce: 0.8192 aux.acc_seg: 53.1178
2024/10/24 21:36:51 - mmengine - INFO - Iter(train) [ 500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:58:23 time: 1.9470 data_time: 0.0168 memory: 6643 grad_norm: 6.7451 loss: 2.4287 decode.loss_ce: 1.7146 decode.acc_seg: 50.1136 aux.loss_ce: 0.7141 aux.acc_seg: 53.7663
2024/10/24 21:38:29 - mmengine - INFO - Iter(train) [ 550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:52:51 time: 1.9918 data_time: 0.0156 memory: 6643 grad_norm: 6.3664 loss: 2.3665 decode.loss_ce: 1.6734 decode.acc_seg: 52.2068 aux.loss_ce: 0.6931 aux.acc_seg: 53.4351
2024/10/24 21:40:06 - mmengine - INFO - Iter(train) [ 600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:47:14 time: 1.9538 data_time: 0.0159 memory: 6643 grad_norm: 4.9070 loss: 2.5350 decode.loss_ce: 1.7855 decode.acc_seg: 39.7007 aux.loss_ce: 0.7495 aux.acc_seg: 35.6692
2024/10/24 21:41:44 - mmengine - INFO - Iter(train) [ 650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:42:22 time: 1.9564 data_time: 0.0166 memory: 6645 grad_norm: 5.9531 loss: 2.6253 decode.loss_ce: 1.8469 decode.acc_seg: 69.2478 aux.loss_ce: 0.7784 aux.acc_seg: 53.8458
2024/10/24 21:43:22 - mmengine - INFO - Iter(train) [ 700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:38:55 time: 1.9558 data_time: 0.0165 memory: 6644 grad_norm: 6.3520 loss: 2.1703 decode.loss_ce: 1.5189 decode.acc_seg: 62.8415 aux.loss_ce: 0.6515 aux.acc_seg: 62.5030
2024/10/24 21:45:00 - mmengine - INFO - Iter(train) [ 750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:35:11 time: 1.9568 data_time: 0.0162 memory: 6643 grad_norm: 8.3637 loss: 2.1590 decode.loss_ce: 1.5294 decode.acc_seg: 57.6995 aux.loss_ce: 0.6296 aux.acc_seg: 57.7778
2024/10/24 21:46:38 - mmengine - INFO - Iter(train) [ 800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:31:45 time: 1.9561 data_time: 0.0159 memory: 6644 grad_norm: 6.9487 loss: 2.3025 decode.loss_ce: 1.6202 decode.acc_seg: 41.6108 aux.loss_ce: 0.6824 aux.acc_seg: 40.7280
2024/10/24 21:48:16 - mmengine - INFO - Iter(train) [ 850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:28:47 time: 1.9597 data_time: 0.0163 memory: 6645 grad_norm: 7.9883 loss: 2.2358 decode.loss_ce: 1.5840 decode.acc_seg: 69.8940 aux.loss_ce: 0.6518 aux.acc_seg: 72.0484
2024/10/24 21:49:54 - mmengine - INFO - Iter(train) [ 900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:25:54 time: 1.9555 data_time: 0.0161 memory: 6643 grad_norm: 7.9283 loss: 2.3677 decode.loss_ce: 1.6707 decode.acc_seg: 45.9333 aux.loss_ce: 0.6971 aux.acc_seg: 44.8699
2024/10/24 21:51:32 - mmengine - INFO - Iter(train) [ 950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:23:15 time: 1.9591 data_time: 0.0166 memory: 6644 grad_norm: 5.5163 loss: 2.0531 decode.loss_ce: 1.4397 decode.acc_seg: 62.8820 aux.loss_ce: 0.6133 aux.acc_seg: 56.2750
2024/10/24 21:53:10 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/24 21:53:10 - mmengine - INFO - Iter(train) [ 1000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:20:41 time: 1.9593 data_time: 0.0161 memory: 6643 grad_norm: 6.6598 loss: 1.8859 decode.loss_ce: 1.3274 decode.acc_seg: 69.1468 aux.loss_ce: 0.5585 aux.acc_seg: 72.4730
2024/10/24 21:54:49 - mmengine - INFO - Iter(train) [ 1050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:18:23 time: 1.9598 data_time: 0.0167 memory: 6645 grad_norm: 5.7744 loss: 1.8190 decode.loss_ce: 1.2808 decode.acc_seg: 64.2969 aux.loss_ce: 0.5382 aux.acc_seg: 57.4179
2024/10/24 21:56:27 - mmengine - INFO - Iter(train) [ 1100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:15:58 time: 1.9580 data_time: 0.0163 memory: 6644 grad_norm: 8.2228 loss: 1.9750 decode.loss_ce: 1.3872 decode.acc_seg: 62.5027 aux.loss_ce: 0.5878 aux.acc_seg: 61.6455
2024/10/24 21:58:05 - mmengine - INFO - Iter(train) [ 1150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:13:37 time: 1.9686 data_time: 0.0164 memory: 6644 grad_norm: 6.6011 loss: 1.8560 decode.loss_ce: 1.3007 decode.acc_seg: 62.1903 aux.loss_ce: 0.5554 aux.acc_seg: 58.9654
2024/10/24 21:59:44 - mmengine - INFO - Iter(train) [ 1200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:12:13 time: 1.9588 data_time: 0.0158 memory: 6645 grad_norm: 6.9876 loss: 1.8768 decode.loss_ce: 1.3239 decode.acc_seg: 64.9271 aux.loss_ce: 0.5530 aux.acc_seg: 63.8752
2024/10/24 22:01:22 - mmengine - INFO - Iter(train) [ 1250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:09:51 time: 1.9600 data_time: 0.0169 memory: 6644 grad_norm: 6.7770 loss: 1.8255 decode.loss_ce: 1.2867 decode.acc_seg: 58.6739 aux.loss_ce: 0.5388 aux.acc_seg: 53.9892
2024/10/24 22:03:00 - mmengine - INFO - Iter(train) [ 1300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:07:55 time: 1.9788 data_time: 0.0164 memory: 6644 grad_norm: 5.2109 loss: 2.0620 decode.loss_ce: 1.4420 decode.acc_seg: 61.2199 aux.loss_ce: 0.6200 aux.acc_seg: 60.7720
2024/10/24 22:04:38 - mmengine - INFO - Iter(train) [ 1350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:05:48 time: 1.9578 data_time: 0.0162 memory: 6643 grad_norm: 5.7423 loss: 1.7033 decode.loss_ce: 1.1981 decode.acc_seg: 71.8589 aux.loss_ce: 0.5052 aux.acc_seg: 70.0480
2024/10/24 22:06:16 - mmengine - INFO - Iter(train) [ 1400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:03:44 time: 1.9624 data_time: 0.0161 memory: 6644 grad_norm: 8.5628 loss: 1.6153 decode.loss_ce: 1.1269 decode.acc_seg: 76.6113 aux.loss_ce: 0.4884 aux.acc_seg: 73.8811
2024/10/24 22:07:54 - mmengine - INFO - Iter(train) [ 1450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 19:01:30 time: 1.9593 data_time: 0.0159 memory: 6643 grad_norm: 8.8335 loss: 1.7693 decode.loss_ce: 1.2299 decode.acc_seg: 64.1270 aux.loss_ce: 0.5394 aux.acc_seg: 65.2504
2024/10/24 22:09:32 - mmengine - INFO - Iter(train) [ 1500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:59:15 time: 1.9546 data_time: 0.0159 memory: 6643 grad_norm: 5.4985 loss: 1.7249 decode.loss_ce: 1.2119 decode.acc_seg: 54.1152 aux.loss_ce: 0.5130 aux.acc_seg: 56.1410
2024/10/24 22:11:10 - mmengine - INFO - Iter(train) [ 1550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:57:06 time: 1.9591 data_time: 0.0167 memory: 6643 grad_norm: 9.7640 loss: 1.5007 decode.loss_ce: 1.0547 decode.acc_seg: 70.4383 aux.loss_ce: 0.4460 aux.acc_seg: 71.6450
2024/10/24 22:12:48 - mmengine - INFO - Iter(train) [ 1600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:54:55 time: 1.9528 data_time: 0.0163 memory: 6643 grad_norm: 7.5189 loss: 1.7430 decode.loss_ce: 1.2193 decode.acc_seg: 66.5783 aux.loss_ce: 0.5237 aux.acc_seg: 67.8757
2024/10/24 22:14:26 - mmengine - INFO - Iter(train) [ 1650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:52:41 time: 1.9549 data_time: 0.0163 memory: 6643 grad_norm: 6.3813 loss: 1.4349 decode.loss_ce: 1.0038 decode.acc_seg: 72.9184 aux.loss_ce: 0.4311 aux.acc_seg: 72.5376
2024/10/24 22:16:04 - mmengine - INFO - Iter(train) [ 1700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:50:34 time: 1.9559 data_time: 0.0162 memory: 6644 grad_norm: 6.5987 loss: 1.6373 decode.loss_ce: 1.1499 decode.acc_seg: 49.0846 aux.loss_ce: 0.4874 aux.acc_seg: 45.6249
2024/10/24 22:17:43 - mmengine - INFO - Iter(train) [ 1750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:49:05 time: 1.9573 data_time: 0.0165 memory: 6645 grad_norm: 6.9929 loss: 1.4036 decode.loss_ce: 0.9799 decode.acc_seg: 74.7000 aux.loss_ce: 0.4237 aux.acc_seg: 75.6526
2024/10/24 22:19:20 - mmengine - INFO - Iter(train) [ 1800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:46:57 time: 1.9564 data_time: 0.0163 memory: 6645 grad_norm: 5.3716 loss: 1.6000 decode.loss_ce: 1.1208 decode.acc_seg: 63.4004 aux.loss_ce: 0.4791 aux.acc_seg: 61.1401
2024/10/24 22:20:58 - mmengine - INFO - Iter(train) [ 1850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:44:57 time: 1.9537 data_time: 0.0160 memory: 6643 grad_norm: 5.1121 loss: 1.6561 decode.loss_ce: 1.1564 decode.acc_seg: 62.0115 aux.loss_ce: 0.4997 aux.acc_seg: 58.4612
2024/10/24 22:22:36 - mmengine - INFO - Iter(train) [ 1900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:43:04 time: 1.9774 data_time: 0.0162 memory: 6643 grad_norm: 5.6404 loss: 1.5225 decode.loss_ce: 1.0673 decode.acc_seg: 74.3073 aux.loss_ce: 0.4552 aux.acc_seg: 69.7333
2024/10/24 22:24:14 - mmengine - INFO - Iter(train) [ 1950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:41:07 time: 1.9554 data_time: 0.0163 memory: 6644 grad_norm: 6.9735 loss: 1.6253 decode.loss_ce: 1.1317 decode.acc_seg: 66.7488 aux.loss_ce: 0.4936 aux.acc_seg: 58.3348
2024/10/24 22:25:52 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/24 22:25:52 - mmengine - INFO - Iter(train) [ 2000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:39:04 time: 1.9540 data_time: 0.0161 memory: 6644 grad_norm: 6.0568 loss: 1.7986 decode.loss_ce: 1.2506 decode.acc_seg: 58.7231 aux.loss_ce: 0.5480 aux.acc_seg: 62.9154
2024/10/24 22:27:30 - mmengine - INFO - Iter(train) [ 2050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:37:03 time: 1.9665 data_time: 0.0162 memory: 6643 grad_norm: 5.0583 loss: 1.4853 decode.loss_ce: 1.0176 decode.acc_seg: 75.3109 aux.loss_ce: 0.4677 aux.acc_seg: 71.9331
2024/10/24 22:29:08 - mmengine - INFO - Iter(train) [ 2100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:34:56 time: 1.9534 data_time: 0.0161 memory: 6644 grad_norm: 4.9237 loss: 1.3155 decode.loss_ce: 0.9064 decode.acc_seg: 61.5741 aux.loss_ce: 0.4091 aux.acc_seg: 59.4323
2024/10/24 22:30:45 - mmengine - INFO - Iter(train) [ 2150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:32:51 time: 1.9523 data_time: 0.0163 memory: 6644 grad_norm: 6.5113 loss: 1.6157 decode.loss_ce: 1.1241 decode.acc_seg: 65.1743 aux.loss_ce: 0.4915 aux.acc_seg: 61.0491
2024/10/24 22:32:23 - mmengine - INFO - Iter(train) [ 2200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:30:52 time: 1.9569 data_time: 0.0166 memory: 6645 grad_norm: 6.9439 loss: 1.8625 decode.loss_ce: 1.3054 decode.acc_seg: 67.3717 aux.loss_ce: 0.5572 aux.acc_seg: 65.6358
2024/10/24 22:34:01 - mmengine - INFO - Iter(train) [ 2250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:28:50 time: 1.9558 data_time: 0.0166 memory: 6645 grad_norm: 5.2878 loss: 1.4524 decode.loss_ce: 1.0098 decode.acc_seg: 70.4488 aux.loss_ce: 0.4425 aux.acc_seg: 70.7049
2024/10/24 22:35:39 - mmengine - INFO - Iter(train) [ 2300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:26:50 time: 1.9525 data_time: 0.0163 memory: 6645 grad_norm: 6.2534 loss: 1.4487 decode.loss_ce: 0.9923 decode.acc_seg: 70.8963 aux.loss_ce: 0.4564 aux.acc_seg: 69.2083
2024/10/24 22:37:17 - mmengine - INFO - Iter(train) [ 2350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:24:59 time: 1.9513 data_time: 0.0161 memory: 6643 grad_norm: 8.9825 loss: 1.4905 decode.loss_ce: 1.0332 decode.acc_seg: 62.0307 aux.loss_ce: 0.4573 aux.acc_seg: 65.1022
2024/10/24 22:38:54 - mmengine - INFO - Iter(train) [ 2400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:23:01 time: 1.9527 data_time: 0.0162 memory: 6644 grad_norm: 6.1178 loss: 1.5442 decode.loss_ce: 1.0718 decode.acc_seg: 64.8092 aux.loss_ce: 0.4725 aux.acc_seg: 60.8989
2024/10/24 22:40:32 - mmengine - INFO - Iter(train) [ 2450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:21:13 time: 1.9498 data_time: 0.0157 memory: 6643 grad_norm: 6.0818 loss: 1.3763 decode.loss_ce: 0.9622 decode.acc_seg: 63.3229 aux.loss_ce: 0.4141 aux.acc_seg: 58.5617
2024/10/24 22:42:10 - mmengine - INFO - Iter(train) [ 2500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:19:13 time: 1.9552 data_time: 0.0166 memory: 6644 grad_norm: 6.9810 loss: 1.5130 decode.loss_ce: 1.0460 decode.acc_seg: 70.2835 aux.loss_ce: 0.4671 aux.acc_seg: 69.0180
2024/10/24 22:43:03 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/24 22:43:48 - mmengine - INFO - Iter(train) [ 2550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:17:15 time: 1.9547 data_time: 0.0172 memory: 6645 grad_norm: 5.4595 loss: 1.4323 decode.loss_ce: 0.9887 decode.acc_seg: 71.4384 aux.loss_ce: 0.4436 aux.acc_seg: 70.4652
2024/10/24 22:45:25 - mmengine - INFO - Iter(train) [ 2600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:15:22 time: 1.9540 data_time: 0.0157 memory: 6643 grad_norm: 6.9353 loss: 1.4288 decode.loss_ce: 0.9849 decode.acc_seg: 82.1113 aux.loss_ce: 0.4439 aux.acc_seg: 77.2691
2024/10/24 22:47:03 - mmengine - INFO - Iter(train) [ 2650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:13:27 time: 1.9683 data_time: 0.0164 memory: 6645 grad_norm: 4.8959 loss: 1.4304 decode.loss_ce: 0.9823 decode.acc_seg: 60.9268 aux.loss_ce: 0.4481 aux.acc_seg: 55.8375
2024/10/24 22:48:44 - mmengine - INFO - Iter(train) [ 2700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:12:58 time: 1.9506 data_time: 0.0163 memory: 6643 grad_norm: 5.1774 loss: 1.4123 decode.loss_ce: 0.9884 decode.acc_seg: 75.4122 aux.loss_ce: 0.4240 aux.acc_seg: 79.1249
2024/10/24 22:50:21 - mmengine - INFO - Iter(train) [ 2750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:11:06 time: 1.9506 data_time: 0.0157 memory: 6643 grad_norm: 6.5345 loss: 1.3753 decode.loss_ce: 0.9582 decode.acc_seg: 69.5480 aux.loss_ce: 0.4171 aux.acc_seg: 65.7991
2024/10/24 22:51:59 - mmengine - INFO - Iter(train) [ 2800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:09:09 time: 1.9568 data_time: 0.0157 memory: 6643 grad_norm: 4.6727 loss: 1.5506 decode.loss_ce: 1.0748 decode.acc_seg: 74.6370 aux.loss_ce: 0.4757 aux.acc_seg: 68.9400
2024/10/24 22:53:38 - mmengine - INFO - Iter(train) [ 2850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:07:40 time: 1.9729 data_time: 0.0150 memory: 6644 grad_norm: 5.0916 loss: 1.4363 decode.loss_ce: 0.9887 decode.acc_seg: 67.4658 aux.loss_ce: 0.4476 aux.acc_seg: 68.3974
2024/10/24 22:55:16 - mmengine - INFO - Iter(train) [ 2900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:06:10 time: 1.9734 data_time: 0.0150 memory: 6644 grad_norm: 5.6762 loss: 1.4163 decode.loss_ce: 0.9676 decode.acc_seg: 78.8433 aux.loss_ce: 0.4487 aux.acc_seg: 77.4854
2024/10/24 22:56:55 - mmengine - INFO - Iter(train) [ 2950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:04:39 time: 1.9716 data_time: 0.0151 memory: 6643 grad_norm: 9.4906 loss: 1.5154 decode.loss_ce: 1.0537 decode.acc_seg: 82.8343 aux.loss_ce: 0.4616 aux.acc_seg: 82.4847
2024/10/24 22:58:33 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/24 22:58:33 - mmengine - INFO - Iter(train) [ 3000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:02:59 time: 1.9468 data_time: 0.0165 memory: 6645 grad_norm: 7.3525 loss: 1.5232 decode.loss_ce: 1.0486 decode.acc_seg: 53.8798 aux.loss_ce: 0.4746 aux.acc_seg: 45.9817
2024/10/24 23:00:11 - mmengine - INFO - Iter(train) [ 3050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 18:00:59 time: 1.9474 data_time: 0.0162 memory: 6644 grad_norm: 5.9922 loss: 1.3255 decode.loss_ce: 0.9100 decode.acc_seg: 59.1693 aux.loss_ce: 0.4155 aux.acc_seg: 54.9050
2024/10/24 23:01:48 - mmengine - INFO - Iter(train) [ 3100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:59:01 time: 1.9511 data_time: 0.0162 memory: 6644 grad_norm: 5.5026 loss: 1.3647 decode.loss_ce: 0.9325 decode.acc_seg: 67.3905 aux.loss_ce: 0.4322 aux.acc_seg: 68.6302
2024/10/24 23:03:26 - mmengine - INFO - Iter(train) [ 3150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:57:03 time: 1.9468 data_time: 0.0164 memory: 6643 grad_norm: 5.9169 loss: 1.4282 decode.loss_ce: 0.9947 decode.acc_seg: 74.4576 aux.loss_ce: 0.4336 aux.acc_seg: 73.5818
2024/10/24 23:05:03 - mmengine - INFO - Iter(train) [ 3200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:55:07 time: 1.9499 data_time: 0.0151 memory: 6644 grad_norm: 5.9433 loss: 1.3641 decode.loss_ce: 0.9534 decode.acc_seg: 53.7042 aux.loss_ce: 0.4107 aux.acc_seg: 54.7861
2024/10/24 23:06:43 - mmengine - INFO - Iter(train) [ 3250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:54:08 time: 1.9462 data_time: 0.0159 memory: 6644 grad_norm: 6.3252 loss: 1.2760 decode.loss_ce: 0.8949 decode.acc_seg: 68.3013 aux.loss_ce: 0.3811 aux.acc_seg: 63.3471
2024/10/24 23:08:21 - mmengine - INFO - Iter(train) [ 3300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:52:21 time: 1.9443 data_time: 0.0143 memory: 6645 grad_norm: 7.2013 loss: 1.3482 decode.loss_ce: 0.9293 decode.acc_seg: 68.4877 aux.loss_ce: 0.4190 aux.acc_seg: 62.5254
2024/10/24 23:09:59 - mmengine - INFO - Iter(train) [ 3350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:50:32 time: 1.9435 data_time: 0.0160 memory: 6645 grad_norm: 7.9952 loss: 1.1850 decode.loss_ce: 0.8201 decode.acc_seg: 72.9295 aux.loss_ce: 0.3648 aux.acc_seg: 70.7385
2024/10/24 23:11:37 - mmengine - INFO - Iter(train) [ 3400/80000]
base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:48:56 time: 1.9789 data_time: 0.0154 memory: 6645 grad_norm: 7.0132 loss: 1.6278 decode.loss_ce: 1.1215 decode.acc_seg: 82.0786 aux.loss_ce: 0.5063 aux.acc_seg: 82.5199 2024/10/24 23:13:15 - mmengine - INFO - Iter(train) [ 3450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:47:15 time: 1.9550 data_time: 0.0155 memory: 6643 grad_norm: 8.5033 loss: 1.1844 decode.loss_ce: 0.8258 decode.acc_seg: 69.6126 aux.loss_ce: 0.3587 aux.acc_seg: 70.6228 2024/10/24 23:14:53 - mmengine - INFO - Iter(train) [ 3500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:45:25 time: 1.9540 data_time: 0.0162 memory: 6643 grad_norm: 6.2935 loss: 1.4581 decode.loss_ce: 0.9976 decode.acc_seg: 69.7859 aux.loss_ce: 0.4604 aux.acc_seg: 64.6442 2024/10/24 23:16:31 - mmengine - INFO - Iter(train) [ 3550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:43:36 time: 1.9564 data_time: 0.0154 memory: 6644 grad_norm: 5.8454 loss: 1.0484 decode.loss_ce: 0.7180 decode.acc_seg: 75.9324 aux.loss_ce: 0.3304 aux.acc_seg: 74.4765 2024/10/24 23:18:08 - mmengine - INFO - Iter(train) [ 3600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:41:47 time: 1.9568 data_time: 0.0151 memory: 6643 grad_norm: 7.9593 loss: 1.3264 decode.loss_ce: 0.9079 decode.acc_seg: 62.1401 aux.loss_ce: 0.4185 aux.acc_seg: 59.6245 2024/10/24 23:19:47 - mmengine - INFO - Iter(train) [ 3650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:40:06 time: 1.9581 data_time: 0.0151 memory: 6643 grad_norm: 6.9764 loss: 1.2717 decode.loss_ce: 0.8859 decode.acc_seg: 56.3909 aux.loss_ce: 0.3858 aux.acc_seg: 57.4419 2024/10/24 23:21:25 - mmengine - INFO - Iter(train) [ 3700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:38:23 time: 1.9557 data_time: 0.0148 memory: 6644 grad_norm: 7.3214 loss: 1.3125 decode.loss_ce: 0.9137 decode.acc_seg: 58.2520 aux.loss_ce: 0.3988 aux.acc_seg: 57.7868 2024/10/24 23:23:02 - mmengine - INFO - Iter(train) [ 3750/80000] 
base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:36:36 time: 1.9533 data_time: 0.0148 memory: 6645 grad_norm: 7.2818 loss: 1.2805 decode.loss_ce: 0.8934 decode.acc_seg: 62.4420 aux.loss_ce: 0.3871 aux.acc_seg: 63.0439 2024/10/24 23:24:43 - mmengine - INFO - Iter(train) [ 3800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:35:56 time: 1.9485 data_time: 0.0156 memory: 6643 grad_norm: 6.5743 loss: 1.2227 decode.loss_ce: 0.8143 decode.acc_seg: 75.6526 aux.loss_ce: 0.4083 aux.acc_seg: 69.1750 2024/10/24 23:26:22 - mmengine - INFO - Iter(train) [ 3850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:34:14 time: 1.9500 data_time: 0.0149 memory: 6644 grad_norm: 6.5175 loss: 1.2953 decode.loss_ce: 0.8869 decode.acc_seg: 70.9629 aux.loss_ce: 0.4085 aux.acc_seg: 75.2977 2024/10/24 23:28:00 - mmengine - INFO - Iter(train) [ 3900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:32:34 time: 1.9753 data_time: 0.0144 memory: 6643 grad_norm: 5.9998 loss: 1.0230 decode.loss_ce: 0.7073 decode.acc_seg: 72.0040 aux.loss_ce: 0.3157 aux.acc_seg: 74.3266 2024/10/24 23:29:38 - mmengine - INFO - Iter(train) [ 3950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:30:47 time: 1.9500 data_time: 0.0162 memory: 6644 grad_norm: 8.8966 loss: 1.3906 decode.loss_ce: 0.9420 decode.acc_seg: 52.5214 aux.loss_ce: 0.4485 aux.acc_seg: 52.3738 2024/10/24 23:31:15 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/24 23:31:15 - mmengine - INFO - Iter(train) [ 4000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:28:57 time: 1.9597 data_time: 0.0153 memory: 6644 grad_norm: 6.7530 loss: 1.2197 decode.loss_ce: 0.8362 decode.acc_seg: 71.3284 aux.loss_ce: 0.3835 aux.acc_seg: 69.9922 2024/10/24 23:32:54 - mmengine - INFO - Iter(train) [ 4050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:27:24 time: 1.9524 data_time: 0.0167 memory: 6643 grad_norm: 6.0637 loss: 1.2986 decode.loss_ce: 0.9028 decode.acc_seg: 74.8524 
aux.loss_ce: 0.3958 aux.acc_seg: 71.0128 2024/10/24 23:34:31 - mmengine - INFO - Iter(train) [ 4100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:25:36 time: 1.9531 data_time: 0.0152 memory: 6644 grad_norm: 5.7276 loss: 1.2994 decode.loss_ce: 0.8843 decode.acc_seg: 56.8925 aux.loss_ce: 0.4152 aux.acc_seg: 53.0056 2024/10/24 23:36:09 - mmengine - INFO - Iter(train) [ 4150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:23:51 time: 1.9576 data_time: 0.0155 memory: 6644 grad_norm: 7.2997 loss: 1.3870 decode.loss_ce: 0.9623 decode.acc_seg: 55.7351 aux.loss_ce: 0.4247 aux.acc_seg: 54.6011 2024/10/24 23:37:47 - mmengine - INFO - Iter(train) [ 4200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:22:02 time: 1.9476 data_time: 0.0155 memory: 6643 grad_norm: 6.5326 loss: 1.2100 decode.loss_ce: 0.8451 decode.acc_seg: 75.9793 aux.loss_ce: 0.3649 aux.acc_seg: 75.1054 2024/10/24 23:39:25 - mmengine - INFO - Iter(train) [ 4250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:20:18 time: 1.9510 data_time: 0.0154 memory: 6644 grad_norm: 7.3224 loss: 1.3632 decode.loss_ce: 0.9231 decode.acc_seg: 62.3679 aux.loss_ce: 0.4401 aux.acc_seg: 60.4023 2024/10/24 23:41:03 - mmengine - INFO - Iter(train) [ 4300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:18:32 time: 1.9516 data_time: 0.0159 memory: 6643 grad_norm: 5.6875 loss: 1.1061 decode.loss_ce: 0.7541 decode.acc_seg: 74.1025 aux.loss_ce: 0.3520 aux.acc_seg: 69.1124 2024/10/24 23:42:43 - mmengine - INFO - Iter(train) [ 4350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:17:35 time: 1.9493 data_time: 0.0153 memory: 6644 grad_norm: 7.1565 loss: 1.1605 decode.loss_ce: 0.7915 decode.acc_seg: 85.5202 aux.loss_ce: 0.3689 aux.acc_seg: 71.9859 2024/10/24 23:44:21 - mmengine - INFO - Iter(train) [ 4400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:15:52 time: 1.9600 data_time: 0.0161 memory: 6643 grad_norm: 7.8735 loss: 1.2445 decode.loss_ce: 0.8645 decode.acc_seg: 76.6296 
aux.loss_ce: 0.3800 aux.acc_seg: 75.1655 2024/10/24 23:45:59 - mmengine - INFO - Iter(train) [ 4450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:14:05 time: 1.9517 data_time: 0.0167 memory: 6643 grad_norm: 5.4363 loss: 1.1037 decode.loss_ce: 0.7541 decode.acc_seg: 62.5456 aux.loss_ce: 0.3496 aux.acc_seg: 57.3260 2024/10/24 23:47:37 - mmengine - INFO - Iter(train) [ 4500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:12:16 time: 1.9550 data_time: 0.0150 memory: 6643 grad_norm: 6.1956 loss: 1.0788 decode.loss_ce: 0.7282 decode.acc_seg: 68.5317 aux.loss_ce: 0.3506 aux.acc_seg: 69.7576 2024/10/24 23:49:14 - mmengine - INFO - Iter(train) [ 4550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:10:31 time: 1.9638 data_time: 0.0149 memory: 6643 grad_norm: 6.5462 loss: 1.2160 decode.loss_ce: 0.8420 decode.acc_seg: 73.1134 aux.loss_ce: 0.3740 aux.acc_seg: 63.2299 2024/10/24 23:50:52 - mmengine - INFO - Iter(train) [ 4600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:08:46 time: 1.9517 data_time: 0.0159 memory: 6643 grad_norm: 6.8350 loss: 1.1717 decode.loss_ce: 0.8112 decode.acc_seg: 82.5910 aux.loss_ce: 0.3604 aux.acc_seg: 82.5801 2024/10/24 23:52:31 - mmengine - INFO - Iter(train) [ 4650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:07:17 time: 1.9526 data_time: 0.0163 memory: 6643 grad_norm: 6.5044 loss: 1.0321 decode.loss_ce: 0.6821 decode.acc_seg: 83.1763 aux.loss_ce: 0.3500 aux.acc_seg: 82.4870 2024/10/24 23:54:09 - mmengine - INFO - Iter(train) [ 4700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:05:36 time: 1.9595 data_time: 0.0155 memory: 6643 grad_norm: 6.5230 loss: 1.2702 decode.loss_ce: 0.8674 decode.acc_seg: 77.6242 aux.loss_ce: 0.4028 aux.acc_seg: 66.7737 2024/10/24 23:55:48 - mmengine - INFO - Iter(train) [ 4750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:04:03 time: 1.9594 data_time: 0.0165 memory: 6643 grad_norm: 6.2981 loss: 1.3068 decode.loss_ce: 0.8825 decode.acc_seg: 73.4525 
aux.loss_ce: 0.4243 aux.acc_seg: 65.8808 2024/10/24 23:57:26 - mmengine - INFO - Iter(train) [ 4800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:02:21 time: 1.9530 data_time: 0.0150 memory: 6644 grad_norm: 5.8485 loss: 1.3209 decode.loss_ce: 0.9111 decode.acc_seg: 62.0365 aux.loss_ce: 0.4098 aux.acc_seg: 60.6413 2024/10/24 23:59:04 - mmengine - INFO - Iter(train) [ 4850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 17:00:43 time: 1.9533 data_time: 0.0151 memory: 6646 grad_norm: 6.7018 loss: 1.1693 decode.loss_ce: 0.8163 decode.acc_seg: 62.6470 aux.loss_ce: 0.3530 aux.acc_seg: 61.2001 2024/10/25 00:00:44 - mmengine - INFO - Iter(train) [ 4900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:59:34 time: 1.9484 data_time: 0.0157 memory: 6644 grad_norm: 5.7819 loss: 1.1353 decode.loss_ce: 0.7716 decode.acc_seg: 60.3370 aux.loss_ce: 0.3637 aux.acc_seg: 63.9732 2024/10/25 00:02:22 - mmengine - INFO - Iter(train) [ 4950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:57:48 time: 1.9560 data_time: 0.0161 memory: 6645 grad_norm: 6.9286 loss: 1.1468 decode.loss_ce: 0.7850 decode.acc_seg: 53.3377 aux.loss_ce: 0.3618 aux.acc_seg: 56.7224 2024/10/25 00:04:00 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 00:04:00 - mmengine - INFO - Iter(train) [ 5000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:56:08 time: 1.9552 data_time: 0.0159 memory: 6644 grad_norm: 6.0041 loss: 1.0215 decode.loss_ce: 0.6986 decode.acc_seg: 62.6103 aux.loss_ce: 0.3229 aux.acc_seg: 70.2931 2024/10/25 00:05:38 - mmengine - INFO - Iter(train) [ 5050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:54:27 time: 1.9568 data_time: 0.0164 memory: 6644 grad_norm: 5.8805 loss: 1.1621 decode.loss_ce: 0.7792 decode.acc_seg: 66.6603 aux.loss_ce: 0.3829 aux.acc_seg: 59.5563 2024/10/25 00:07:16 - mmengine - INFO - Iter(train) [ 5100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:52:40 time: 1.9479 
data_time: 0.0153 memory: 6643 grad_norm: 6.5756 loss: 1.0271 decode.loss_ce: 0.7156 decode.acc_seg: 66.6034 aux.loss_ce: 0.3115 aux.acc_seg: 70.6771 2024/10/25 00:08:53 - mmengine - INFO - Iter(train) [ 5150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:50:53 time: 1.9440 data_time: 0.0161 memory: 6644 grad_norm: 6.2512 loss: 1.1528 decode.loss_ce: 0.7986 decode.acc_seg: 72.4064 aux.loss_ce: 0.3542 aux.acc_seg: 75.4089 2024/10/25 00:10:31 - mmengine - INFO - Iter(train) [ 5200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:49:06 time: 1.9456 data_time: 0.0164 memory: 6644 grad_norm: 4.7909 loss: 1.1148 decode.loss_ce: 0.7742 decode.acc_seg: 79.4017 aux.loss_ce: 0.3406 aux.acc_seg: 77.8968 2024/10/25 00:12:09 - mmengine - INFO - Iter(train) [ 5250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:47:22 time: 1.9463 data_time: 0.0170 memory: 6643 grad_norm: 5.7524 loss: 1.0160 decode.loss_ce: 0.6907 decode.acc_seg: 71.0942 aux.loss_ce: 0.3252 aux.acc_seg: 66.4258 2024/10/25 00:13:46 - mmengine - INFO - Iter(train) [ 5300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:45:35 time: 1.9531 data_time: 0.0156 memory: 6643 grad_norm: 6.8819 loss: 1.1356 decode.loss_ce: 0.7698 decode.acc_seg: 61.7792 aux.loss_ce: 0.3658 aux.acc_seg: 58.5939 2024/10/25 00:15:24 - mmengine - INFO - Iter(train) [ 5350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:43:47 time: 1.9491 data_time: 0.0157 memory: 6645 grad_norm: 6.4697 loss: 1.2974 decode.loss_ce: 0.9002 decode.acc_seg: 70.8937 aux.loss_ce: 0.3972 aux.acc_seg: 64.8317 2024/10/25 00:17:02 - mmengine - INFO - Iter(train) [ 5400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:42:11 time: 1.9564 data_time: 0.0157 memory: 6645 grad_norm: 6.9266 loss: 1.1649 decode.loss_ce: 0.7892 decode.acc_seg: 71.1096 aux.loss_ce: 0.3756 aux.acc_seg: 69.4556 2024/10/25 00:18:43 - mmengine - INFO - Iter(train) [ 5450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:41:02 time: 1.9518 
data_time: 0.0150 memory: 6644 grad_norm: 6.0594 loss: 1.3192 decode.loss_ce: 0.9117 decode.acc_seg: 55.4299 aux.loss_ce: 0.4075 aux.acc_seg: 56.3990 2024/10/25 00:20:20 - mmengine - INFO - Iter(train) [ 5500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:39:14 time: 1.9510 data_time: 0.0157 memory: 6643 grad_norm: 5.1118 loss: 1.1741 decode.loss_ce: 0.8098 decode.acc_seg: 66.5518 aux.loss_ce: 0.3643 aux.acc_seg: 68.3384 2024/10/25 00:21:58 - mmengine - INFO - Iter(train) [ 5550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:37:28 time: 1.9480 data_time: 0.0156 memory: 6643 grad_norm: 5.5996 loss: 1.0748 decode.loss_ce: 0.7484 decode.acc_seg: 69.9170 aux.loss_ce: 0.3265 aux.acc_seg: 68.3788 2024/10/25 00:23:35 - mmengine - INFO - Iter(train) [ 5600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:35:42 time: 1.9527 data_time: 0.0157 memory: 6645 grad_norm: 6.7506 loss: 0.9912 decode.loss_ce: 0.6828 decode.acc_seg: 68.8985 aux.loss_ce: 0.3084 aux.acc_seg: 67.0571 2024/10/25 00:25:13 - mmengine - INFO - Iter(train) [ 5650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:34:00 time: 1.9470 data_time: 0.0150 memory: 6643 grad_norm: 5.6143 loss: 1.0370 decode.loss_ce: 0.7081 decode.acc_seg: 80.4153 aux.loss_ce: 0.3290 aux.acc_seg: 77.0360 2024/10/25 00:26:51 - mmengine - INFO - Iter(train) [ 5700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:32:17 time: 1.9539 data_time: 0.0151 memory: 6643 grad_norm: 6.0148 loss: 1.1513 decode.loss_ce: 0.7751 decode.acc_seg: 74.9409 aux.loss_ce: 0.3761 aux.acc_seg: 72.7071 2024/10/25 00:28:29 - mmengine - INFO - Iter(train) [ 5750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:30:35 time: 1.9606 data_time: 0.0160 memory: 6644 grad_norm: 7.6954 loss: 1.2669 decode.loss_ce: 0.8857 decode.acc_seg: 66.6776 aux.loss_ce: 0.3811 aux.acc_seg: 61.5041 2024/10/25 00:30:07 - mmengine - INFO - Iter(train) [ 5800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:28:53 time: 1.9531 
data_time: 0.0155 memory: 6643 grad_norm: 7.9042 loss: 1.2744 decode.loss_ce: 0.8772 decode.acc_seg: 73.5586 aux.loss_ce: 0.3972 aux.acc_seg: 66.7636 2024/10/25 00:31:45 - mmengine - INFO - Iter(train) [ 5850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:27:10 time: 1.9623 data_time: 0.0161 memory: 6643 grad_norm: 6.0817 loss: 1.2535 decode.loss_ce: 0.8574 decode.acc_seg: 84.8283 aux.loss_ce: 0.3962 aux.acc_seg: 85.5660 2024/10/25 00:33:22 - mmengine - INFO - Iter(train) [ 5900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:25:23 time: 1.9488 data_time: 0.0147 memory: 6643 grad_norm: 5.0424 loss: 1.1205 decode.loss_ce: 0.7753 decode.acc_seg: 76.5661 aux.loss_ce: 0.3452 aux.acc_seg: 76.6351 2024/10/25 00:35:00 - mmengine - INFO - Iter(train) [ 5950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:23:35 time: 1.9476 data_time: 0.0158 memory: 6643 grad_norm: 6.0878 loss: 1.1554 decode.loss_ce: 0.7723 decode.acc_seg: 70.4076 aux.loss_ce: 0.3830 aux.acc_seg: 67.5313 2024/10/25 00:36:38 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 00:36:38 - mmengine - INFO - Iter(train) [ 6000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:21:57 time: 1.9537 data_time: 0.0149 memory: 6643 grad_norm: 7.4294 loss: 1.0491 decode.loss_ce: 0.7039 decode.acc_seg: 75.5182 aux.loss_ce: 0.3452 aux.acc_seg: 68.8917 2024/10/25 00:38:16 - mmengine - INFO - Iter(train) [ 6050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:20:13 time: 1.9584 data_time: 0.0151 memory: 6645 grad_norm: 6.5284 loss: 1.0336 decode.loss_ce: 0.7067 decode.acc_seg: 81.2818 aux.loss_ce: 0.3269 aux.acc_seg: 76.0477 2024/10/25 00:39:54 - mmengine - INFO - Iter(train) [ 6100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:18:35 time: 1.9726 data_time: 0.0139 memory: 6643 grad_norm: 6.2290 loss: 1.4215 decode.loss_ce: 0.9579 decode.acc_seg: 72.1570 aux.loss_ce: 0.4636 aux.acc_seg: 63.7803 2024/10/25 00:41:32 - 
mmengine - INFO - Iter(train) [ 6150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:16:55 time: 1.9467 data_time: 0.0158 memory: 6644 grad_norm: 5.7285 loss: 1.0315 decode.loss_ce: 0.6974 decode.acc_seg: 75.1544 aux.loss_ce: 0.3340 aux.acc_seg: 76.0504 2024/10/25 00:43:10 - mmengine - INFO - Iter(train) [ 6200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:15:15 time: 1.9569 data_time: 0.0153 memory: 6644 grad_norm: 7.6892 loss: 0.9345 decode.loss_ce: 0.6355 decode.acc_seg: 75.2343 aux.loss_ce: 0.2990 aux.acc_seg: 72.9431 2024/10/25 00:44:48 - mmengine - INFO - Iter(train) [ 6250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:13:33 time: 1.9575 data_time: 0.0157 memory: 6645 grad_norm: 6.4666 loss: 1.1656 decode.loss_ce: 0.8108 decode.acc_seg: 64.2169 aux.loss_ce: 0.3548 aux.acc_seg: 60.8977 2024/10/25 00:46:25 - mmengine - INFO - Iter(train) [ 6300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:11:49 time: 1.9494 data_time: 0.0156 memory: 6644 grad_norm: 7.1360 loss: 1.1773 decode.loss_ce: 0.8050 decode.acc_seg: 73.5405 aux.loss_ce: 0.3723 aux.acc_seg: 73.0920 2024/10/25 00:48:03 - mmengine - INFO - Iter(train) [ 6350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:10:07 time: 1.9485 data_time: 0.0158 memory: 6645 grad_norm: 6.7517 loss: 1.1494 decode.loss_ce: 0.7872 decode.acc_seg: 74.0762 aux.loss_ce: 0.3622 aux.acc_seg: 68.8417 2024/10/25 00:49:44 - mmengine - INFO - Iter(train) [ 6400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:09:02 time: 1.9602 data_time: 0.0160 memory: 6643 grad_norm: 7.1278 loss: 1.0933 decode.loss_ce: 0.7609 decode.acc_seg: 62.0089 aux.loss_ce: 0.3324 aux.acc_seg: 64.7422 2024/10/25 00:51:22 - mmengine - INFO - Iter(train) [ 6450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:07:19 time: 1.9530 data_time: 0.0153 memory: 6645 grad_norm: 6.6387 loss: 1.1075 decode.loss_ce: 0.7745 decode.acc_seg: 49.0820 aux.loss_ce: 0.3329 aux.acc_seg: 50.1226 2024/10/25 00:53:00 - 
mmengine - INFO - Iter(train) [ 6500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:05:35 time: 1.9510 data_time: 0.0166 memory: 6643 grad_norm: 6.2747 loss: 1.1781 decode.loss_ce: 0.8071 decode.acc_seg: 66.1779 aux.loss_ce: 0.3710 aux.acc_seg: 64.4796 2024/10/25 00:54:37 - mmengine - INFO - Iter(train) [ 6550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:03:51 time: 1.9580 data_time: 0.0168 memory: 6643 grad_norm: 7.2354 loss: 1.2753 decode.loss_ce: 0.8666 decode.acc_seg: 63.3649 aux.loss_ce: 0.4086 aux.acc_seg: 58.9176 2024/10/25 00:56:15 - mmengine - INFO - Iter(train) [ 6600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:02:07 time: 1.9638 data_time: 0.0163 memory: 6643 grad_norm: 6.4386 loss: 1.1736 decode.loss_ce: 0.7999 decode.acc_seg: 69.2077 aux.loss_ce: 0.3737 aux.acc_seg: 62.5795 2024/10/25 00:57:53 - mmengine - INFO - Iter(train) [ 6650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 16:00:27 time: 1.9497 data_time: 0.0159 memory: 6645 grad_norm: 7.5157 loss: 1.1303 decode.loss_ce: 0.7770 decode.acc_seg: 73.3019 aux.loss_ce: 0.3533 aux.acc_seg: 68.6694 2024/10/25 00:59:31 - mmengine - INFO - Iter(train) [ 6700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:58:43 time: 1.9513 data_time: 0.0161 memory: 6643 grad_norm: 5.7766 loss: 1.1399 decode.loss_ce: 0.7799 decode.acc_seg: 71.9364 aux.loss_ce: 0.3600 aux.acc_seg: 67.0347 2024/10/25 01:01:09 - mmengine - INFO - Iter(train) [ 6750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:57:05 time: 1.9490 data_time: 0.0155 memory: 6643 grad_norm: 7.0135 loss: 1.1181 decode.loss_ce: 0.7734 decode.acc_seg: 68.0430 aux.loss_ce: 0.3447 aux.acc_seg: 66.1563 2024/10/25 01:02:47 - mmengine - INFO - Iter(train) [ 6800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:55:21 time: 1.9442 data_time: 0.0155 memory: 6645 grad_norm: 5.5925 loss: 0.9990 decode.loss_ce: 0.6971 decode.acc_seg: 86.8795 aux.loss_ce: 0.3019 aux.acc_seg: 83.8711 2024/10/25 01:04:24 - 
mmengine - INFO - Iter(train) [ 6850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:53:38 time: 1.9523 data_time: 0.0152 memory: 6645 grad_norm: 7.3671 loss: 1.1682 decode.loss_ce: 0.8081 decode.acc_seg: 64.1055 aux.loss_ce: 0.3601 aux.acc_seg: 58.0672 2024/10/25 01:06:02 - mmengine - INFO - Iter(train) [ 6900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:51:53 time: 1.9473 data_time: 0.0160 memory: 6645 grad_norm: 5.5218 loss: 1.0827 decode.loss_ce: 0.7350 decode.acc_seg: 73.9815 aux.loss_ce: 0.3476 aux.acc_seg: 73.1762 2024/10/25 01:07:43 - mmengine - INFO - Iter(train) [ 6950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:50:43 time: 1.9474 data_time: 0.0148 memory: 6643 grad_norm: 5.9990 loss: 1.2962 decode.loss_ce: 0.9011 decode.acc_seg: 72.0876 aux.loss_ce: 0.3951 aux.acc_seg: 70.0045 2024/10/25 01:09:20 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 01:09:20 - mmengine - INFO - Iter(train) [ 7000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:49:00 time: 1.9502 data_time: 0.0162 memory: 6643 grad_norm: 6.8179 loss: 1.1005 decode.loss_ce: 0.7648 decode.acc_seg: 74.7844 aux.loss_ce: 0.3357 aux.acc_seg: 72.4937 2024/10/25 01:10:58 - mmengine - INFO - Iter(train) [ 7050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:47:20 time: 1.9535 data_time: 0.0147 memory: 6644 grad_norm: 7.2591 loss: 1.0994 decode.loss_ce: 0.7428 decode.acc_seg: 65.7253 aux.loss_ce: 0.3566 aux.acc_seg: 64.8432 2024/10/25 01:12:36 - mmengine - INFO - Iter(train) [ 7100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:45:36 time: 1.9464 data_time: 0.0153 memory: 6644 grad_norm: 6.7567 loss: 1.1642 decode.loss_ce: 0.7801 decode.acc_seg: 71.6149 aux.loss_ce: 0.3841 aux.acc_seg: 60.3563 2024/10/25 01:14:14 - mmengine - INFO - Iter(train) [ 7150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:43:55 time: 1.9662 data_time: 0.0150 memory: 6645 grad_norm: 6.4652 loss: 0.9041 
decode.loss_ce: 0.6140 decode.acc_seg: 84.7464 aux.loss_ce: 0.2900 aux.acc_seg: 84.1376 2024/10/25 01:15:51 - mmengine - INFO - Iter(train) [ 7200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:42:09 time: 1.9464 data_time: 0.0154 memory: 6644 grad_norm: 5.7393 loss: 0.9069 decode.loss_ce: 0.6262 decode.acc_seg: 81.5565 aux.loss_ce: 0.2808 aux.acc_seg: 81.6183 2024/10/25 01:17:29 - mmengine - INFO - Iter(train) [ 7250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:40:24 time: 1.9474 data_time: 0.0149 memory: 6643 grad_norm: 5.9189 loss: 1.0767 decode.loss_ce: 0.7221 decode.acc_seg: 64.8094 aux.loss_ce: 0.3546 aux.acc_seg: 59.0408 2024/10/25 01:19:07 - mmengine - INFO - Iter(train) [ 7300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:38:45 time: 1.9470 data_time: 0.0161 memory: 6645 grad_norm: 6.7241 loss: 1.0531 decode.loss_ce: 0.7106 decode.acc_seg: 73.1590 aux.loss_ce: 0.3425 aux.acc_seg: 70.5288 2024/10/25 01:20:45 - mmengine - INFO - Iter(train) [ 7350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:37:05 time: 1.9551 data_time: 0.0160 memory: 6645 grad_norm: 5.9314 loss: 1.1434 decode.loss_ce: 0.7736 decode.acc_seg: 72.6336 aux.loss_ce: 0.3697 aux.acc_seg: 59.3870 2024/10/25 01:22:22 - mmengine - INFO - Iter(train) [ 7400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:35:19 time: 1.9451 data_time: 0.0145 memory: 6644 grad_norm: 6.2528 loss: 1.0814 decode.loss_ce: 0.7268 decode.acc_seg: 78.2970 aux.loss_ce: 0.3546 aux.acc_seg: 76.3675 2024/10/25 01:24:00 - mmengine - INFO - Iter(train) [ 7450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:33:34 time: 1.9466 data_time: 0.0167 memory: 6645 grad_norm: 6.7564 loss: 0.8658 decode.loss_ce: 0.5897 decode.acc_seg: 83.1758 aux.loss_ce: 0.2762 aux.acc_seg: 80.1510 2024/10/25 01:25:38 - mmengine - INFO - Iter(train) [ 7500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:31:53 time: 1.9481 data_time: 0.0168 memory: 6645 grad_norm: 6.8246 loss: 1.1348 
decode.loss_ce: 0.7618 decode.acc_seg: 81.2996 aux.loss_ce: 0.3730 aux.acc_seg: 77.3087 2024/10/25 01:27:15 - mmengine - INFO - Iter(train) [ 7550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:30:09 time: 1.9523 data_time: 0.0165 memory: 6644 grad_norm: 5.4947 loss: 1.1482 decode.loss_ce: 0.7891 decode.acc_seg: 77.4072 aux.loss_ce: 0.3591 aux.acc_seg: 73.9804 2024/10/25 01:28:53 - mmengine - INFO - Iter(train) [ 7600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:28:29 time: 1.9499 data_time: 0.0167 memory: 6645 grad_norm: 5.9200 loss: 1.0399 decode.loss_ce: 0.7128 decode.acc_seg: 73.5657 aux.loss_ce: 0.3271 aux.acc_seg: 69.9587 2024/10/25 01:30:31 - mmengine - INFO - Iter(train) [ 7650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:26:47 time: 1.9423 data_time: 0.0163 memory: 6643 grad_norm: 5.6353 loss: 1.1544 decode.loss_ce: 0.7683 decode.acc_seg: 80.4572 aux.loss_ce: 0.3861 aux.acc_seg: 74.7644 2024/10/25 01:32:09 - mmengine - INFO - Iter(train) [ 7700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:25:07 time: 1.9527 data_time: 0.0153 memory: 6645 grad_norm: 6.0633 loss: 0.8091 decode.loss_ce: 0.5432 decode.acc_seg: 89.7166 aux.loss_ce: 0.2659 aux.acc_seg: 69.6771 2024/10/25 01:33:46 - mmengine - INFO - Iter(train) [ 7750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:23:25 time: 1.9518 data_time: 0.0156 memory: 6643 grad_norm: 7.3503 loss: 1.0212 decode.loss_ce: 0.7020 decode.acc_seg: 60.8607 aux.loss_ce: 0.3192 aux.acc_seg: 60.0843 2024/10/25 01:35:24 - mmengine - INFO - Iter(train) [ 7800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:21:41 time: 1.9489 data_time: 0.0160 memory: 6643 grad_norm: 6.7842 loss: 1.1805 decode.loss_ce: 0.7903 decode.acc_seg: 58.2767 aux.loss_ce: 0.3902 aux.acc_seg: 55.8628 2024/10/25 01:37:01 - mmengine - INFO - Iter(train) [ 7850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:19:57 time: 1.9452 data_time: 0.0157 memory: 6645 grad_norm: 5.5906 loss: 0.9408 
decode.loss_ce: 0.6275 decode.acc_seg: 81.8984 aux.loss_ce: 0.3133 aux.acc_seg: 81.2037 2024/10/25 01:38:42 - mmengine - INFO - Iter(train) [ 7900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:18:37 time: 1.9455 data_time: 0.0158 memory: 6644 grad_norm: 6.1982 loss: 0.9652 decode.loss_ce: 0.6472 decode.acc_seg: 70.7150 aux.loss_ce: 0.3180 aux.acc_seg: 70.6738 2024/10/25 01:40:19 - mmengine - INFO - Iter(train) [ 7950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:16:55 time: 1.9568 data_time: 0.0152 memory: 6644 grad_norm: 8.5602 loss: 1.0072 decode.loss_ce: 0.6692 decode.acc_seg: 75.2312 aux.loss_ce: 0.3380 aux.acc_seg: 65.1477 2024/10/25 01:41:57 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 01:41:57 - mmengine - INFO - Iter(train) [ 8000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:15:14 time: 1.9487 data_time: 0.0143 memory: 6644 grad_norm: 4.8109 loss: 1.0428 decode.loss_ce: 0.7006 decode.acc_seg: 64.8858 aux.loss_ce: 0.3422 aux.acc_seg: 57.9869 2024/10/25 01:41:57 - mmengine - INFO - Saving checkpoint at 8000 iterations 2024/10/25 01:42:36 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:05:19 time: 0.1404 data_time: 0.0017 memory: 20735 2024/10/25 01:42:50 - mmengine - INFO - Iter(val) [100/500] eta: 0:03:19 time: 0.0451 data_time: 0.0018 memory: 20736 2024/10/25 01:42:54 - mmengine - INFO - Iter(val) [150/500] eta: 0:02:05 time: 0.0298 data_time: 0.0016 memory: 5551 2024/10/25 01:43:01 - mmengine - INFO - Iter(val) [200/500] eta: 0:01:30 time: 0.0420 data_time: 0.0018 memory: 5554 2024/10/25 01:43:04 - mmengine - INFO - Iter(val) [250/500] eta: 0:01:03 time: 0.0439 data_time: 0.0019 memory: 5555 2024/10/25 01:43:08 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:45 time: 0.0435 data_time: 0.0020 memory: 5556 2024/10/25 01:43:10 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:29 time: 0.0469 data_time: 0.0016 memory: 834 2024/10/25 01:43:13 - mmengine - INFO - Iter(val) 
[400/500] eta: 0:00:18 time: 0.0310 data_time: 0.0015 memory: 5551
2024/10/25 01:43:16 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:08 time: 0.0324 data_time: 0.0015 memory: 5552
2024/10/25 01:43:22 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0973 data_time: 0.0019 memory: 20733
2024/10/25 01:43:33 - mmengine - INFO - per class results:
2024/10/25 01:43:33 - mmengine - INFO -
+---------------------+-------+-------+
| Class               | IoU   | Acc   |
+---------------------+-------+-------+
| wall                | 64.4  | 82.89 |
| building            | 75.07 | 89.08 |
| sky                 | 87.34 | 94.25 |
| floor               | 67.64 | 84.44 |
| tree                | 64.41 | 80.59 |
| ceiling             | 73.22 | 84.11 |
| road                | 72.7  | 88.12 |
| bed                 | 73.75 | 86.28 |
| windowpane          | 45.76 | 60.91 |
| grass               | 64.95 | 83.88 |
| cabinet             | 46.37 | 57.46 |
| sidewalk            | 45.12 | 60.3  |
| person              | 57.52 | 73.56 |
| earth               | 30.59 | 43.49 |
| door                | 26.03 | 37.45 |
| table               | 38.77 | 60.43 |
| mountain            | 52.32 | 75.68 |
| plant               | 41.92 | 52.58 |
| curtain             | 53.35 | 68.11 |
| chair               | 31.6  | 42.08 |
| car                 | 63.79 | 74.06 |
| water               | 41.86 | 55.17 |
| painting            | 47.09 | 60.16 |
| sofa                | 50.11 | 72.91 |
| shelf               | 28.41 | 38.99 |
| house               | 37.12 | 57.97 |
| sea                 | 44.74 | 73.35 |
| mirror              | 43.09 | 49.38 |
| rug                 | 42.93 | 55.67 |
| field               | 24.31 | 36.13 |
| armchair            | 21.89 | 31.38 |
| seat                | 52.97 | 75.16 |
| fence               | 33.75 | 43.89 |
| desk                | 32.07 | 52.09 |
| rock                | 32.8  | 41.65 |
| wardrobe            | 38.14 | 63.81 |
| lamp                | 25.2  | 32.43 |
| bathtub             | 62.05 | 71.49 |
| railing             | 27.4  | 32.94 |
| cushion             | 25.97 | 35.86 |
| base                | 11.93 | 14.9  |
| box                 | 12.83 | 17.62 |
| column              | 32.69 | 41.3  |
| signboard           | 16.94 | 25.32 |
| chest of drawers    | 36.38 | 54.11 |
| counter             | 20.36 | 24.57 |
| sand                | 25.44 | 45.04 |
| sink                | 42.15 | 50.85 |
| skyscraper          | 47.1  | 62.44 |
| fireplace           | 55.0  | 70.51 |
| refrigerator        | 42.42 | 76.94 |
| grandstand          | 31.64 | 73.24 |
| path                | 7.25  | 8.73  |
| stairs              | 22.57 | 28.74 |
| runway              | 74.03 | 90.91 |
| case                | 39.81 | 56.92 |
| pool table          | 51.56 | 55.39 |
| pillow              | 27.6  | 31.23 |
| screen door         | 54.65 | 66.86 |
| stairway            | 23.16 | 36.51 |
| river               | 14.55 | 27.56 |
| bridge              | 52.37 | 70.71 |
| bookcase            | 19.7  | 39.5  |
| blind               | 12.69 | 13.7  |
| coffee table        | 35.18 | 72.27 |
| toilet              | 58.32 | 73.18 |
| flower              | 20.61 | 27.83 |
| book                | 29.6  | 41.29 |
| hill                | 2.36  | 2.77  |
| bench               | 25.7  | 34.49 |
| countertop          | 32.12 | 40.79 |
| stove               | 46.74 | 69.22 |
| palm                | 33.23 | 51.68 |
| kitchen island      | 21.17 | 48.4  |
| computer            | 51.24 | 66.98 |
| swivel chair        | 24.13 | 33.55 |
| boat                | 34.51 | 65.26 |
| bar                 | 20.52 | 25.21 |
| arcade machine      | 56.04 | 67.99 |
| hovel               | 27.71 | 38.21 |
| bus                 | 53.7  | 87.29 |
| towel               | 29.1  | 36.15 |
| light               | 1.81  | 1.84  |
| truck               | 16.6  | 26.61 |
| tower               | 26.22 | 53.52 |
| chandelier          | 40.04 | 63.36 |
| awning              | 4.01  | 4.69  |
| streetlight         | 0.0   | 0.0   |
| booth               | 25.49 | 26.33 |
| television receiver | 42.1  | 46.5  |
| airplane            | 29.76 | 58.47 |
| dirt track          | 0.0   | 0.0   |
| apparel             | 18.39 | 31.31 |
| pole                | 5.33  | 6.02  |
| land                | 0.0   | 0.0   |
| bannister           | 0.0   | 0.0   |
| escalator           | 31.18 | 40.89 |
| ottoman             | 15.2  | 17.32 |
| bottle              | 5.38  | 5.86  |
| buffet              | 23.98 | 31.66 |
| poster              | 0.92  | 0.93  |
| stage               | 2.61  | 7.82  |
| van                 | 26.12 | 30.15 |
| ship                | 72.2  | 90.46 |
| fountain            | 0.0   | 0.0   |
| conveyer belt       | 55.73 | 75.84 |
| canopy              | 5.62  | 7.25  |
| washer              | 46.22 | 72.65 |
| plaything           | 1.99  | 2.49  |
| swimming pool       | 28.28 | 42.91 |
| stool               | 3.59  | 3.81  |
| barrel              | 9.03  | 64.75 |
| basket              | 4.77  | 5.26  |
| waterfall           | 56.78 | 70.5  |
| tent                | 54.27 | 98.46 |
| bag                 | 0.0   | 0.0   |
| minibike            | 44.48 | 52.8  |
| cradle              | 49.01 | 76.02 |
| oven                | 8.07  | 11.21 |
| ball                | 17.48 | 23.93 |
| food                | 44.34 | 57.61 |
| step                | 1.78  | 1.8   |
| tank                | 31.91 | 32.72 |
| trade name          | 3.58  | 3.62  |
| microwave           | 18.99 | 22.15 |
| pot                 | 8.8   | 9.21  |
| animal              | 40.1  | 46.66 |
| bicycle             | 25.88 | 36.71 |
| lake                | 0.0   | 0.0   |
| dishwasher          | 21.71 | 25.05 |
| screen              | 54.74 | 77.77 |
| blanket             | 0.0   | 0.0   |
| sculpture           | 15.15 | 16.15 |
| hood                | 5.59  | 5.61  |
| sconce              | 0.0   | 0.0   |
| vase                | 10.42 | 14.11 |
| traffic light       | 5.38  | 7.48  |
| tray                | 0.04  | 0.04  |
| ashcan              | 10.99 | 15.33 |
| fan                 | 16.19 | 18.79 |
| pier                | 28.67 | 41.34 |
| crt screen          | 0.0   | 0.0   |
| plate               | 17.74 | 20.83 |
| monitor             | 40.89 | 51.8  |
| bulletin board      | 1.35  | 1.44  |
| shower              | 0.0   | 0.0   |
| radiator            | 20.96 | 22.88 |
| glass               | 0.0   | 0.0   |
| clock               | 0.41  | 0.41  |
| flag                | 7.03  | 7.11  |
+---------------------+-------+-------+
2024/10/25 01:43:33 - mmengine - INFO - Iter(val) [500/500] aAcc: 73.6900 mIoU: 29.8000 mAcc: 40.7600 data_time: 0.0021 time: 0.1621
2024/10/25 01:45:10 - mmengine - INFO - Iter(train) [ 8050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:15:13 time: 1.9521 data_time: 0.0167 memory: 6645 grad_norm: 8.0796 loss: 1.0125 decode.loss_ce: 0.6808 decode.acc_seg: 83.0087 aux.loss_ce: 0.3317 aux.acc_seg: 76.5098
2024/10/25 01:46:49 - mmengine - INFO - Iter(train) [ 8100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:13:37 time: 1.9776 data_time: 0.0147 memory: 6645 grad_norm: 4.9856 loss: 1.1493 decode.loss_ce: 0.7794 decode.acc_seg: 81.5854 aux.loss_ce: 0.3698 aux.acc_seg: 78.2517
2024/10/25 01:48:27 - mmengine - INFO - Iter(train) [ 8150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:11:57 time: 1.9542 data_time: 0.0171 memory: 6645 grad_norm: 6.9076 loss: 1.1094 decode.loss_ce: 0.7561 decode.acc_seg: 77.5763 aux.loss_ce: 0.3533 aux.acc_seg: 75.6010
2024/10/25 01:50:05 - mmengine - INFO - Iter(train) [ 8200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:10:19 time: 1.9664 data_time: 0.0171 memory: 6644 grad_norm: 6.6906 loss: 1.0180 decode.loss_ce: 0.6928 decode.acc_seg: 74.4566 aux.loss_ce: 0.3251 aux.acc_seg: 72.1157
2024/10/25 01:51:44 - mmengine - INFO - Iter(train) [ 8250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:08:42 time: 1.9594 data_time: 0.0163 memory: 6645 grad_norm: 8.4283
loss: 1.1735 decode.loss_ce: 0.7859 decode.acc_seg: 72.5570 aux.loss_ce: 0.3877 aux.acc_seg: 65.7929 2024/10/25 01:53:22 - mmengine - INFO - Iter(train) [ 8300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:07:02 time: 1.9610 data_time: 0.0169 memory: 6646 grad_norm: 6.1961 loss: 0.9009 decode.loss_ce: 0.6149 decode.acc_seg: 79.9454 aux.loss_ce: 0.2860 aux.acc_seg: 76.6777 2024/10/25 01:55:00 - mmengine - INFO - Iter(train) [ 8350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:05:23 time: 1.9608 data_time: 0.0168 memory: 6646 grad_norm: 7.5804 loss: 0.9242 decode.loss_ce: 0.6137 decode.acc_seg: 78.7600 aux.loss_ce: 0.3105 aux.acc_seg: 77.7119 2024/10/25 01:56:38 - mmengine - INFO - Iter(train) [ 8400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:03:48 time: 1.9600 data_time: 0.0176 memory: 6645 grad_norm: 7.2986 loss: 1.0237 decode.loss_ce: 0.6803 decode.acc_seg: 67.9610 aux.loss_ce: 0.3434 aux.acc_seg: 66.7669 2024/10/25 01:58:17 - mmengine - INFO - Iter(train) [ 8450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:02:12 time: 1.9541 data_time: 0.0172 memory: 6645 grad_norm: 7.1019 loss: 0.9553 decode.loss_ce: 0.6493 decode.acc_seg: 79.4166 aux.loss_ce: 0.3060 aux.acc_seg: 77.8892 2024/10/25 01:59:55 - mmengine - INFO - Iter(train) [ 8500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 15:00:32 time: 1.9582 data_time: 0.0167 memory: 6646 grad_norm: 8.0382 loss: 0.9991 decode.loss_ce: 0.6631 decode.acc_seg: 66.9375 aux.loss_ce: 0.3360 aux.acc_seg: 67.5262 2024/10/25 02:01:33 - mmengine - INFO - Iter(train) [ 8550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:58:52 time: 1.9632 data_time: 0.0171 memory: 6645 grad_norm: 5.1196 loss: 1.0513 decode.loss_ce: 0.7163 decode.acc_seg: 65.6127 aux.loss_ce: 0.3349 aux.acc_seg: 62.0684 2024/10/25 02:03:11 - mmengine - INFO - Iter(train) [ 8600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:57:14 time: 1.9548 data_time: 0.0167 memory: 6646 grad_norm: 7.5361 
loss: 1.1099 decode.loss_ce: 0.7705 decode.acc_seg: 69.6574 aux.loss_ce: 0.3395 aux.acc_seg: 70.6569 2024/10/25 02:04:50 - mmengine - INFO - Iter(train) [ 8650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:55:39 time: 1.9557 data_time: 0.0168 memory: 6646 grad_norm: 6.7200 loss: 1.0242 decode.loss_ce: 0.7148 decode.acc_seg: 72.1777 aux.loss_ce: 0.3093 aux.acc_seg: 70.0822 2024/10/25 02:06:28 - mmengine - INFO - Iter(train) [ 8700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:54:02 time: 1.9768 data_time: 0.0159 memory: 6646 grad_norm: 7.9959 loss: 1.2122 decode.loss_ce: 0.8200 decode.acc_seg: 73.5606 aux.loss_ce: 0.3922 aux.acc_seg: 66.4575 2024/10/25 02:08:06 - mmengine - INFO - Iter(train) [ 8750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:52:22 time: 1.9580 data_time: 0.0157 memory: 6646 grad_norm: 5.0630 loss: 0.9281 decode.loss_ce: 0.6349 decode.acc_seg: 71.3332 aux.loss_ce: 0.2933 aux.acc_seg: 71.1952 2024/10/25 02:09:44 - mmengine - INFO - Iter(train) [ 8800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:50:45 time: 1.9771 data_time: 0.0157 memory: 6645 grad_norm: 4.7145 loss: 0.9724 decode.loss_ce: 0.6375 decode.acc_seg: 74.7118 aux.loss_ce: 0.3349 aux.acc_seg: 72.5379 2024/10/25 02:11:22 - mmengine - INFO - Iter(train) [ 8850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:49:06 time: 1.9572 data_time: 0.0161 memory: 6646 grad_norm: 6.0604 loss: 0.9196 decode.loss_ce: 0.6382 decode.acc_seg: 82.5502 aux.loss_ce: 0.2814 aux.acc_seg: 81.7846 2024/10/25 02:13:00 - mmengine - INFO - Iter(train) [ 8900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:47:25 time: 1.9513 data_time: 0.0166 memory: 6647 grad_norm: 7.5233 loss: 0.9984 decode.loss_ce: 0.6855 decode.acc_seg: 74.5921 aux.loss_ce: 0.3129 aux.acc_seg: 71.1024 2024/10/25 02:14:38 - mmengine - INFO - Iter(train) [ 8950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:45:45 time: 1.9856 data_time: 0.0166 memory: 6645 grad_norm: 6.4323 
loss: 0.9149 decode.loss_ce: 0.6159 decode.acc_seg: 85.8451 aux.loss_ce: 0.2990 aux.acc_seg: 85.2607 2024/10/25 02:16:16 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 02:16:16 - mmengine - INFO - Iter(train) [ 9000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:44:06 time: 1.9573 data_time: 0.0173 memory: 6646 grad_norm: 5.7333 loss: 1.0374 decode.loss_ce: 0.7061 decode.acc_seg: 69.1953 aux.loss_ce: 0.3314 aux.acc_seg: 67.5040 2024/10/25 02:17:54 - mmengine - INFO - Iter(train) [ 9050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:42:27 time: 1.9597 data_time: 0.0165 memory: 6647 grad_norm: 5.4335 loss: 1.0094 decode.loss_ce: 0.7048 decode.acc_seg: 74.5861 aux.loss_ce: 0.3046 aux.acc_seg: 72.7504 2024/10/25 02:19:32 - mmengine - INFO - Iter(train) [ 9100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:40:47 time: 1.9563 data_time: 0.0161 memory: 6645 grad_norm: 6.0351 loss: 1.1000 decode.loss_ce: 0.7492 decode.acc_seg: 73.7241 aux.loss_ce: 0.3507 aux.acc_seg: 69.5486 2024/10/25 02:21:10 - mmengine - INFO - Iter(train) [ 9150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:39:08 time: 1.9586 data_time: 0.0174 memory: 6646 grad_norm: 7.0136 loss: 1.0251 decode.loss_ce: 0.6958 decode.acc_seg: 84.4344 aux.loss_ce: 0.3292 aux.acc_seg: 83.2772 2024/10/25 02:22:49 - mmengine - INFO - Iter(train) [ 9200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:37:30 time: 1.9760 data_time: 0.0152 memory: 6645 grad_norm: 5.6956 loss: 0.9827 decode.loss_ce: 0.6624 decode.acc_seg: 79.7231 aux.loss_ce: 0.3203 aux.acc_seg: 69.2559 2024/10/25 02:24:27 - mmengine - INFO - Iter(train) [ 9250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:35:51 time: 1.9518 data_time: 0.0161 memory: 6646 grad_norm: 6.2347 loss: 0.8517 decode.loss_ce: 0.5975 decode.acc_seg: 77.9488 aux.loss_ce: 0.2542 aux.acc_seg: 78.3986 2024/10/25 02:26:05 - mmengine - INFO - Iter(train) [ 9300/80000] base_lr: 
1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:34:13 time: 1.9601 data_time: 0.0156 memory: 6645 grad_norm: 5.1169 loss: 0.9116 decode.loss_ce: 0.6188 decode.acc_seg: 80.1009 aux.loss_ce: 0.2928 aux.acc_seg: 74.2656 2024/10/25 02:27:43 - mmengine - INFO - Iter(train) [ 9350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:32:33 time: 1.9593 data_time: 0.0156 memory: 6646 grad_norm: 6.7170 loss: 0.9013 decode.loss_ce: 0.6152 decode.acc_seg: 85.9791 aux.loss_ce: 0.2860 aux.acc_seg: 76.5159 2024/10/25 02:29:21 - mmengine - INFO - Iter(train) [ 9400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:30:55 time: 1.9628 data_time: 0.0155 memory: 6646 grad_norm: 5.8990 loss: 0.8925 decode.loss_ce: 0.6061 decode.acc_seg: 81.2404 aux.loss_ce: 0.2864 aux.acc_seg: 77.7563 2024/10/25 02:30:59 - mmengine - INFO - Iter(train) [ 9450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:29:15 time: 1.9578 data_time: 0.0177 memory: 6646 grad_norm: 6.0892 loss: 1.0184 decode.loss_ce: 0.6816 decode.acc_seg: 76.7527 aux.loss_ce: 0.3368 aux.acc_seg: 78.9542 2024/10/25 02:32:37 - mmengine - INFO - Iter(train) [ 9500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:27:37 time: 1.9577 data_time: 0.0173 memory: 6646 grad_norm: 8.9926 loss: 1.1150 decode.loss_ce: 0.7645 decode.acc_seg: 61.9640 aux.loss_ce: 0.3505 aux.acc_seg: 63.1990 2024/10/25 02:34:16 - mmengine - INFO - Iter(train) [ 9550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:26:01 time: 1.9714 data_time: 0.0161 memory: 6645 grad_norm: 6.0782 loss: 0.9932 decode.loss_ce: 0.6799 decode.acc_seg: 78.1585 aux.loss_ce: 0.3133 aux.acc_seg: 74.9470 2024/10/25 02:35:54 - mmengine - INFO - Iter(train) [ 9600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:24:23 time: 1.9595 data_time: 0.0166 memory: 6646 grad_norm: 5.1884 loss: 0.9903 decode.loss_ce: 0.6643 decode.acc_seg: 78.0630 aux.loss_ce: 0.3260 aux.acc_seg: 63.9710 2024/10/25 02:37:32 - mmengine - INFO - Iter(train) [ 9650/80000] base_lr: 
1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:22:44 time: 1.9546 data_time: 0.0164 memory: 6646 grad_norm: 6.0124 loss: 1.0641 decode.loss_ce: 0.7135 decode.acc_seg: 67.1835 aux.loss_ce: 0.3506 aux.acc_seg: 59.0424 2024/10/25 02:39:10 - mmengine - INFO - Iter(train) [ 9700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:21:03 time: 1.9560 data_time: 0.0169 memory: 6646 grad_norm: 6.4763 loss: 0.8527 decode.loss_ce: 0.5900 decode.acc_seg: 78.2023 aux.loss_ce: 0.2627 aux.acc_seg: 77.9561 2024/10/25 02:40:48 - mmengine - INFO - Iter(train) [ 9750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:19:23 time: 1.9604 data_time: 0.0165 memory: 6645 grad_norm: 5.8817 loss: 0.9992 decode.loss_ce: 0.7025 decode.acc_seg: 83.2637 aux.loss_ce: 0.2968 aux.acc_seg: 79.8821 2024/10/25 02:42:26 - mmengine - INFO - Iter(train) [ 9800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:17:44 time: 1.9556 data_time: 0.0176 memory: 6645 grad_norm: 5.6298 loss: 1.1278 decode.loss_ce: 0.7659 decode.acc_seg: 72.1087 aux.loss_ce: 0.3619 aux.acc_seg: 72.3275 2024/10/25 02:44:04 - mmengine - INFO - Iter(train) [ 9850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:16:04 time: 1.9619 data_time: 0.0161 memory: 6646 grad_norm: 6.4918 loss: 1.0565 decode.loss_ce: 0.7351 decode.acc_seg: 75.7377 aux.loss_ce: 0.3213 aux.acc_seg: 72.5250 2024/10/25 02:45:44 - mmengine - INFO - Iter(train) [ 9900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:14:36 time: 1.9505 data_time: 0.0152 memory: 6645 grad_norm: 5.6466 loss: 0.9880 decode.loss_ce: 0.6553 decode.acc_seg: 69.9535 aux.loss_ce: 0.3327 aux.acc_seg: 68.6603 2024/10/25 02:47:22 - mmengine - INFO - Iter(train) [ 9950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:12:56 time: 1.9567 data_time: 0.0159 memory: 6645 grad_norm: 6.5123 loss: 0.9221 decode.loss_ce: 0.6354 decode.acc_seg: 76.7750 aux.loss_ce: 0.2867 aux.acc_seg: 66.9219 2024/10/25 02:49:00 - mmengine - INFO - Exp name: 
deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 02:49:00 - mmengine - INFO - Iter(train) [10000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:11:16 time: 1.9502 data_time: 0.0167 memory: 6645 grad_norm: 5.9887 loss: 0.9108 decode.loss_ce: 0.6081 decode.acc_seg: 68.9941 aux.loss_ce: 0.3027 aux.acc_seg: 59.4256 2024/10/25 02:50:38 - mmengine - INFO - Iter(train) [10050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:09:37 time: 1.9573 data_time: 0.0162 memory: 6646 grad_norm: 5.0801 loss: 1.0854 decode.loss_ce: 0.7388 decode.acc_seg: 68.4910 aux.loss_ce: 0.3465 aux.acc_seg: 65.4380 2024/10/25 02:52:15 - mmengine - INFO - Iter(train) [10100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:07:55 time: 1.9586 data_time: 0.0163 memory: 6645 grad_norm: 5.4162 loss: 1.0031 decode.loss_ce: 0.6818 decode.acc_seg: 80.4109 aux.loss_ce: 0.3213 aux.acc_seg: 80.0891 2024/10/25 02:53:53 - mmengine - INFO - Iter(train) [10150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:06:15 time: 1.9567 data_time: 0.0169 memory: 6645 grad_norm: 5.9508 loss: 1.0130 decode.loss_ce: 0.6636 decode.acc_seg: 73.0657 aux.loss_ce: 0.3494 aux.acc_seg: 61.8999 2024/10/25 02:55:31 - mmengine - INFO - Iter(train) [10200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:04:35 time: 1.9627 data_time: 0.0166 memory: 6645 grad_norm: 5.8995 loss: 1.1112 decode.loss_ce: 0.7404 decode.acc_seg: 74.8103 aux.loss_ce: 0.3708 aux.acc_seg: 74.9370 2024/10/25 02:57:09 - mmengine - INFO - Iter(train) [10250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:02:56 time: 1.9609 data_time: 0.0167 memory: 6646 grad_norm: 6.1245 loss: 1.0396 decode.loss_ce: 0.6945 decode.acc_seg: 69.6230 aux.loss_ce: 0.3451 aux.acc_seg: 71.2263 2024/10/25 02:58:47 - mmengine - INFO - Iter(train) [10300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 14:01:14 time: 1.9570 data_time: 0.0167 memory: 6647 grad_norm: 7.3485 loss: 0.9481 decode.loss_ce: 0.6396 
decode.acc_seg: 77.6486 aux.loss_ce: 0.3085 aux.acc_seg: 70.7826 2024/10/25 03:00:25 - mmengine - INFO - Iter(train) [10350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:59:36 time: 1.9482 data_time: 0.0158 memory: 6645 grad_norm: 5.4586 loss: 0.9764 decode.loss_ce: 0.6594 decode.acc_seg: 82.5826 aux.loss_ce: 0.3170 aux.acc_seg: 79.0408 2024/10/25 03:02:03 - mmengine - INFO - Iter(train) [10400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:57:57 time: 1.9754 data_time: 0.0164 memory: 6645 grad_norm: 6.1870 loss: 1.0459 decode.loss_ce: 0.6922 decode.acc_seg: 75.1997 aux.loss_ce: 0.3537 aux.acc_seg: 75.0257 2024/10/25 03:03:42 - mmengine - INFO - Iter(train) [10450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:56:24 time: 1.9561 data_time: 0.0154 memory: 6645 grad_norm: 5.5327 loss: 0.9189 decode.loss_ce: 0.6309 decode.acc_seg: 78.7944 aux.loss_ce: 0.2880 aux.acc_seg: 73.8800 2024/10/25 03:05:20 - mmengine - INFO - Iter(train) [10500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:54:44 time: 1.9609 data_time: 0.0163 memory: 6646 grad_norm: 7.8120 loss: 0.9973 decode.loss_ce: 0.6874 decode.acc_seg: 78.3580 aux.loss_ce: 0.3100 aux.acc_seg: 75.8616 2024/10/25 03:06:58 - mmengine - INFO - Iter(train) [10550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:53:04 time: 1.9550 data_time: 0.0167 memory: 6646 grad_norm: 7.0828 loss: 0.9002 decode.loss_ce: 0.6259 decode.acc_seg: 74.0961 aux.loss_ce: 0.2743 aux.acc_seg: 71.9809 2024/10/25 03:08:36 - mmengine - INFO - Iter(train) [10600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:51:27 time: 1.9561 data_time: 0.0170 memory: 6646 grad_norm: 8.3393 loss: 0.9495 decode.loss_ce: 0.6477 decode.acc_seg: 66.6713 aux.loss_ce: 0.3018 aux.acc_seg: 66.9527 2024/10/25 03:10:15 - mmengine - INFO - Iter(train) [10650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:49:50 time: 1.9614 data_time: 0.0164 memory: 6645 grad_norm: 6.0295 loss: 0.9741 decode.loss_ce: 0.6646 
decode.acc_seg: 61.2888 aux.loss_ce: 0.3095 aux.acc_seg: 64.5429 2024/10/25 03:11:52 - mmengine - INFO - Iter(train) [10700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:48:08 time: 1.9505 data_time: 0.0161 memory: 6646 grad_norm: 6.3717 loss: 0.9410 decode.loss_ce: 0.6396 decode.acc_seg: 67.2019 aux.loss_ce: 0.3014 aux.acc_seg: 68.7208 2024/10/25 03:13:30 - mmengine - INFO - Iter(train) [10750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:46:27 time: 1.9613 data_time: 0.0172 memory: 6645 grad_norm: 5.7140 loss: 0.9512 decode.loss_ce: 0.6433 decode.acc_seg: 84.0577 aux.loss_ce: 0.3079 aux.acc_seg: 73.4406 2024/10/25 03:15:08 - mmengine - INFO - Iter(train) [10800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:44:48 time: 1.9540 data_time: 0.0163 memory: 6645 grad_norm: 6.9466 loss: 1.1662 decode.loss_ce: 0.7697 decode.acc_seg: 77.9352 aux.loss_ce: 0.3965 aux.acc_seg: 73.4832 2024/10/25 03:16:46 - mmengine - INFO - Iter(train) [10850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:43:07 time: 1.9578 data_time: 0.0169 memory: 6646 grad_norm: 8.1637 loss: 1.0009 decode.loss_ce: 0.6858 decode.acc_seg: 81.3326 aux.loss_ce: 0.3151 aux.acc_seg: 82.9887 2024/10/25 03:18:24 - mmengine - INFO - Iter(train) [10900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:41:27 time: 1.9609 data_time: 0.0168 memory: 6645 grad_norm: 5.6552 loss: 1.0807 decode.loss_ce: 0.7365 decode.acc_seg: 71.2169 aux.loss_ce: 0.3442 aux.acc_seg: 71.9544 2024/10/25 03:20:02 - mmengine - INFO - Iter(train) [10950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:39:50 time: 1.9566 data_time: 0.0170 memory: 6646 grad_norm: 5.9352 loss: 0.8430 decode.loss_ce: 0.5641 decode.acc_seg: 77.0090 aux.loss_ce: 0.2789 aux.acc_seg: 76.2721 2024/10/25 03:21:42 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 03:21:42 - mmengine - INFO - Iter(train) [11000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 
13:38:25 time: 1.9536 data_time: 0.0165 memory: 6646 grad_norm: 4.8265 loss: 0.8826 decode.loss_ce: 0.6018 decode.acc_seg: 80.9739 aux.loss_ce: 0.2808 aux.acc_seg: 81.3739 2024/10/25 03:23:20 - mmengine - INFO - Iter(train) [11050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:36:44 time: 1.9564 data_time: 0.0177 memory: 6646 grad_norm: 5.7037 loss: 0.8835 decode.loss_ce: 0.5940 decode.acc_seg: 82.9777 aux.loss_ce: 0.2894 aux.acc_seg: 76.3589 2024/10/25 03:24:58 - mmengine - INFO - Iter(train) [11100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:35:04 time: 1.9611 data_time: 0.0171 memory: 6645 grad_norm: 5.8498 loss: 0.7685 decode.loss_ce: 0.5141 decode.acc_seg: 77.6899 aux.loss_ce: 0.2544 aux.acc_seg: 72.8818 2024/10/25 03:26:36 - mmengine - INFO - Iter(train) [11150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:33:23 time: 1.9485 data_time: 0.0168 memory: 6645 grad_norm: 6.0778 loss: 1.0239 decode.loss_ce: 0.6821 decode.acc_seg: 80.8146 aux.loss_ce: 0.3417 aux.acc_seg: 72.6482 2024/10/25 03:28:14 - mmengine - INFO - Iter(train) [11200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:31:46 time: 1.9622 data_time: 0.0148 memory: 6645 grad_norm: 5.5928 loss: 0.9204 decode.loss_ce: 0.6385 decode.acc_seg: 87.0085 aux.loss_ce: 0.2819 aux.acc_seg: 86.3256 2024/10/25 03:29:52 - mmengine - INFO - Iter(train) [11250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:30:07 time: 1.9765 data_time: 0.0159 memory: 6646 grad_norm: 5.3889 loss: 1.2057 decode.loss_ce: 0.8189 decode.acc_seg: 80.2595 aux.loss_ce: 0.3868 aux.acc_seg: 81.2907 2024/10/25 03:31:30 - mmengine - INFO - Iter(train) [11300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:28:27 time: 1.9609 data_time: 0.0169 memory: 6645 grad_norm: 8.3309 loss: 0.9635 decode.loss_ce: 0.6560 decode.acc_seg: 80.7473 aux.loss_ce: 0.3075 aux.acc_seg: 79.1290 2024/10/25 03:33:08 - mmengine - INFO - Iter(train) [11350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 
13:26:48 time: 1.9796 data_time: 0.0150 memory: 6645 grad_norm: 6.6159 loss: 0.9023 decode.loss_ce: 0.6043 decode.acc_seg: 82.5668 aux.loss_ce: 0.2980 aux.acc_seg: 72.2248 2024/10/25 03:34:46 - mmengine - INFO - Iter(train) [11400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:25:07 time: 1.9559 data_time: 0.0160 memory: 6646 grad_norm: 5.6064 loss: 0.8542 decode.loss_ce: 0.5817 decode.acc_seg: 74.9009 aux.loss_ce: 0.2725 aux.acc_seg: 66.0354 2024/10/25 03:36:24 - mmengine - INFO - Iter(train) [11450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:23:26 time: 1.9530 data_time: 0.0162 memory: 6645 grad_norm: 4.8200 loss: 0.8869 decode.loss_ce: 0.5892 decode.acc_seg: 74.7312 aux.loss_ce: 0.2977 aux.acc_seg: 67.6597 2024/10/25 03:38:02 - mmengine - INFO - Iter(train) [11500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:21:47 time: 1.9564 data_time: 0.0168 memory: 6647 grad_norm: 6.4014 loss: 0.9770 decode.loss_ce: 0.6711 decode.acc_seg: 79.0326 aux.loss_ce: 0.3059 aux.acc_seg: 72.9826 2024/10/25 03:39:43 - mmengine - INFO - Iter(train) [11550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:20:27 time: 1.9576 data_time: 0.0172 memory: 6645 grad_norm: 6.4791 loss: 0.8309 decode.loss_ce: 0.5605 decode.acc_seg: 79.0052 aux.loss_ce: 0.2704 aux.acc_seg: 73.1931 2024/10/25 03:41:21 - mmengine - INFO - Iter(train) [11600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:18:46 time: 1.9481 data_time: 0.0172 memory: 6646 grad_norm: 5.7587 loss: 0.9950 decode.loss_ce: 0.6786 decode.acc_seg: 85.3744 aux.loss_ce: 0.3164 aux.acc_seg: 79.9759 2024/10/25 03:42:59 - mmengine - INFO - Iter(train) [11650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:17:06 time: 1.9598 data_time: 0.0165 memory: 6645 grad_norm: 7.2630 loss: 1.1483 decode.loss_ce: 0.7664 decode.acc_seg: 75.8632 aux.loss_ce: 0.3819 aux.acc_seg: 74.1618 2024/10/25 03:44:37 - mmengine - INFO - Iter(train) [11700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 
13:15:27 time: 1.9567 data_time: 0.0161 memory: 6645 grad_norm: 6.1317 loss: 0.8248 decode.loss_ce: 0.5585 decode.acc_seg: 67.3240 aux.loss_ce: 0.2662 aux.acc_seg: 66.3271 2024/10/25 03:46:15 - mmengine - INFO - Iter(train) [11750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:13:48 time: 1.9563 data_time: 0.0158 memory: 6648 grad_norm: 6.1810 loss: 0.9360 decode.loss_ce: 0.6488 decode.acc_seg: 78.1176 aux.loss_ce: 0.2872 aux.acc_seg: 77.4105 2024/10/25 03:47:52 - mmengine - INFO - Iter(train) [11800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:12:07 time: 1.9523 data_time: 0.0167 memory: 6646 grad_norm: 6.2061 loss: 0.9314 decode.loss_ce: 0.6222 decode.acc_seg: 69.3918 aux.loss_ce: 0.3092 aux.acc_seg: 68.5468 2024/10/25 03:49:31 - mmengine - INFO - Iter(train) [11850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:10:30 time: 1.9611 data_time: 0.0163 memory: 6645 grad_norm: 5.0478 loss: 0.8028 decode.loss_ce: 0.5377 decode.acc_seg: 78.2972 aux.loss_ce: 0.2651 aux.acc_seg: 75.8236 2024/10/25 03:51:10 - mmengine - INFO - Iter(train) [11900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:08:56 time: 1.9583 data_time: 0.0163 memory: 6646 grad_norm: 5.2155 loss: 0.8472 decode.loss_ce: 0.5802 decode.acc_seg: 77.8723 aux.loss_ce: 0.2670 aux.acc_seg: 77.4935 2024/10/25 03:52:48 - mmengine - INFO - Iter(train) [11950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:07:17 time: 1.9811 data_time: 0.0158 memory: 6647 grad_norm: 7.9149 loss: 0.9646 decode.loss_ce: 0.6583 decode.acc_seg: 69.1678 aux.loss_ce: 0.3063 aux.acc_seg: 69.8351 2024/10/25 03:54:26 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 03:54:26 - mmengine - INFO - Iter(train) [12000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:05:38 time: 1.9604 data_time: 0.0163 memory: 6646 grad_norm: 7.0400 loss: 1.0758 decode.loss_ce: 0.7231 decode.acc_seg: 74.2412 aux.loss_ce: 0.3527 aux.acc_seg: 76.3873 
2024/10/25 03:56:04 - mmengine - INFO - Iter(train) [12050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:03:59 time: 1.9545 data_time: 0.0173 memory: 6645 grad_norm: 8.0809 loss: 0.9727 decode.loss_ce: 0.6523 decode.acc_seg: 71.6309 aux.loss_ce: 0.3204 aux.acc_seg: 72.5501 2024/10/25 03:57:42 - mmengine - INFO - Iter(train) [12100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:02:21 time: 1.9578 data_time: 0.0156 memory: 6645 grad_norm: 5.6440 loss: 0.9399 decode.loss_ce: 0.6392 decode.acc_seg: 71.8271 aux.loss_ce: 0.3007 aux.acc_seg: 71.5321 2024/10/25 03:59:20 - mmengine - INFO - Iter(train) [12150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 13:00:40 time: 1.9502 data_time: 0.0167 memory: 6646 grad_norm: 5.4364 loss: 0.7811 decode.loss_ce: 0.5196 decode.acc_seg: 74.1464 aux.loss_ce: 0.2615 aux.acc_seg: 73.8600 2024/10/25 04:00:58 - mmengine - INFO - Iter(train) [12200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:58:59 time: 1.9503 data_time: 0.0172 memory: 6645 grad_norm: 5.3577 loss: 0.8829 decode.loss_ce: 0.6021 decode.acc_seg: 76.8520 aux.loss_ce: 0.2807 aux.acc_seg: 75.5151 2024/10/25 04:02:35 - mmengine - INFO - Iter(train) [12250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:57:19 time: 1.9512 data_time: 0.0166 memory: 6645 grad_norm: 8.9350 loss: 0.8721 decode.loss_ce: 0.5948 decode.acc_seg: 70.7846 aux.loss_ce: 0.2773 aux.acc_seg: 66.6751 2024/10/25 04:04:13 - mmengine - INFO - Iter(train) [12300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:55:39 time: 1.9560 data_time: 0.0184 memory: 6646 grad_norm: 8.8112 loss: 0.9327 decode.loss_ce: 0.6213 decode.acc_seg: 80.8212 aux.loss_ce: 0.3114 aux.acc_seg: 83.1206 2024/10/25 04:05:51 - mmengine - INFO - Iter(train) [12350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:53:59 time: 1.9530 data_time: 0.0174 memory: 6646 grad_norm: 5.7592 loss: 0.8788 decode.loss_ce: 0.6024 decode.acc_seg: 86.6644 aux.loss_ce: 0.2764 aux.acc_seg: 84.1213 
2024/10/25 04:07:29 - mmengine - INFO - Iter(train) [12400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:52:20 time: 1.9531 data_time: 0.0156 memory: 6646 grad_norm: 6.9604 loss: 0.9644 decode.loss_ce: 0.6370 decode.acc_seg: 82.8529 aux.loss_ce: 0.3274 aux.acc_seg: 74.8206 2024/10/25 04:09:07 - mmengine - INFO - Iter(train) [12450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:50:41 time: 1.9530 data_time: 0.0159 memory: 6646 grad_norm: 5.9601 loss: 0.9494 decode.loss_ce: 0.6515 decode.acc_seg: 76.2734 aux.loss_ce: 0.2979 aux.acc_seg: 76.5575 2024/10/25 04:10:45 - mmengine - INFO - Iter(train) [12500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:49:02 time: 1.9482 data_time: 0.0155 memory: 6645 grad_norm: 5.5984 loss: 1.0723 decode.loss_ce: 0.7190 decode.acc_seg: 76.9292 aux.loss_ce: 0.3533 aux.acc_seg: 72.4000 2024/10/25 04:12:23 - mmengine - INFO - Iter(train) [12550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:47:22 time: 1.9565 data_time: 0.0165 memory: 6645 grad_norm: 5.5530 loss: 0.8794 decode.loss_ce: 0.6046 decode.acc_seg: 72.9159 aux.loss_ce: 0.2748 aux.acc_seg: 71.8918 2024/10/25 04:14:01 - mmengine - INFO - Iter(train) [12600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:45:43 time: 1.9538 data_time: 0.0166 memory: 6646 grad_norm: 6.3655 loss: 0.8553 decode.loss_ce: 0.5689 decode.acc_seg: 80.8104 aux.loss_ce: 0.2865 aux.acc_seg: 81.2535 2024/10/25 04:15:39 - mmengine - INFO - Iter(train) [12650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:44:04 time: 1.9571 data_time: 0.0167 memory: 6645 grad_norm: 6.1101 loss: 0.8930 decode.loss_ce: 0.6143 decode.acc_seg: 77.3081 aux.loss_ce: 0.2787 aux.acc_seg: 68.9723 2024/10/25 04:17:17 - mmengine - INFO - Iter(train) [12700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:42:24 time: 1.9537 data_time: 0.0156 memory: 6645 grad_norm: 6.0252 loss: 0.9435 decode.loss_ce: 0.6259 decode.acc_seg: 64.9274 aux.loss_ce: 0.3176 aux.acc_seg: 67.2859 
2024/10/25 04:18:55 - mmengine - INFO - Iter(train) [12750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:40:46 time: 1.9707 data_time: 0.0183 memory: 6646 grad_norm: 6.6194 loss: 0.9508 decode.loss_ce: 0.6353 decode.acc_seg: 80.2388 aux.loss_ce: 0.3155 aux.acc_seg: 72.1835
2024/10/25 04:20:33 - mmengine - INFO - Iter(train) [12800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:39:06 time: 1.9558 data_time: 0.0170 memory: 6645 grad_norm: 5.4860 loss: 0.9395 decode.loss_ce: 0.6330 decode.acc_seg: 75.4724 aux.loss_ce: 0.3066 aux.acc_seg: 70.7673
2024/10/25 04:22:10 - mmengine - INFO - Iter(train) [12850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:37:24 time: 1.9517 data_time: 0.0174 memory: 6646 grad_norm: 6.8208 loss: 1.1161 decode.loss_ce: 0.7421 decode.acc_seg: 69.0436 aux.loss_ce: 0.3740 aux.acc_seg: 63.3797
2024/10/25 04:23:49 - mmengine - INFO - Iter(train) [12900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:35:46 time: 1.9627 data_time: 0.0158 memory: 6646 grad_norm: 6.0214 loss: 0.8607 decode.loss_ce: 0.5859 decode.acc_seg: 80.6863 aux.loss_ce: 0.2748 aux.acc_seg: 79.6933
2024/10/25 04:25:27 - mmengine - INFO - Iter(train) [12950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:34:07 time: 1.9535 data_time: 0.0161 memory: 6645 grad_norm: 6.0451 loss: 0.9952 decode.loss_ce: 0.6774 decode.acc_seg: 79.4535 aux.loss_ce: 0.3179 aux.acc_seg: 76.3367
2024/10/25 04:27:05 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 04:27:05 - mmengine - INFO - Iter(train) [13000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:32:30 time: 1.9491 data_time: 0.0171 memory: 6645 grad_norm: 5.6558 loss: 0.8608 decode.loss_ce: 0.5860 decode.acc_seg: 69.6846 aux.loss_ce: 0.2748 aux.acc_seg: 67.3707
2024/10/25 04:28:43 - mmengine - INFO - Iter(train) [13050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:30:52 time: 1.9471 data_time: 0.0165 memory: 6645 grad_norm: 5.2203 loss: 0.8284 decode.loss_ce: 0.5586 decode.acc_seg: 72.9786 aux.loss_ce: 0.2698 aux.acc_seg: 70.0521
2024/10/25 04:30:21 - mmengine - INFO - Iter(train) [13100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:29:12 time: 1.9530 data_time: 0.0166 memory: 6645 grad_norm: 6.2195 loss: 0.8927 decode.loss_ce: 0.6094 decode.acc_seg: 73.3774 aux.loss_ce: 0.2833 aux.acc_seg: 74.1169
2024/10/25 04:31:59 - mmengine - INFO - Iter(train) [13150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:27:33 time: 1.9669 data_time: 0.0166 memory: 6646 grad_norm: 5.5045 loss: 0.9509 decode.loss_ce: 0.6395 decode.acc_seg: 81.1774 aux.loss_ce: 0.3114 aux.acc_seg: 83.3714
2024/10/25 04:33:37 - mmengine - INFO - Iter(train) [13200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:25:53 time: 1.9482 data_time: 0.0168 memory: 6645 grad_norm: 5.2181 loss: 0.9842 decode.loss_ce: 0.6783 decode.acc_seg: 75.2317 aux.loss_ce: 0.3059 aux.acc_seg: 70.9505
2024/10/25 04:35:15 - mmengine - INFO - Iter(train) [13250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:24:13 time: 1.9556 data_time: 0.0172 memory: 6646 grad_norm: 5.6689 loss: 0.7547 decode.loss_ce: 0.5040 decode.acc_seg: 86.1204 aux.loss_ce: 0.2506 aux.acc_seg: 83.1047
2024/10/25 04:36:53 - mmengine - INFO - Iter(train) [13300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:22:34 time: 1.9600 data_time: 0.0163 memory: 6646 grad_norm: 4.9477 loss: 0.9765 decode.loss_ce: 0.6462 decode.acc_seg: 83.0018 aux.loss_ce: 0.3302 aux.acc_seg: 83.1732
2024/10/25 04:38:31 - mmengine - INFO - Iter(train) [13350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:20:54 time: 1.9678 data_time: 0.0168 memory: 6645 grad_norm: 6.6394 loss: 0.9592 decode.loss_ce: 0.6441 decode.acc_seg: 77.9425 aux.loss_ce: 0.3151 aux.acc_seg: 72.7475
2024/10/25 04:40:08 - mmengine - INFO - Iter(train) [13400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:19:15 time: 1.9491 data_time: 0.0172 memory: 6646 grad_norm: 6.0536 loss: 0.8535 decode.loss_ce: 0.5951 decode.acc_seg: 81.7325 aux.loss_ce: 0.2584 aux.acc_seg: 77.8527
2024/10/25 04:41:46 - mmengine - INFO - Iter(train) [13450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:17:34 time: 1.9473 data_time: 0.0170 memory: 6646 grad_norm: 5.8019 loss: 0.9718 decode.loss_ce: 0.6580 decode.acc_seg: 77.2928 aux.loss_ce: 0.3138 aux.acc_seg: 73.7573
2024/10/25 04:43:24 - mmengine - INFO - Iter(train) [13500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:15:53 time: 1.9502 data_time: 0.0169 memory: 6646 grad_norm: 5.2235 loss: 0.9593 decode.loss_ce: 0.6367 decode.acc_seg: 66.6042 aux.loss_ce: 0.3226 aux.acc_seg: 64.5496
2024/10/25 04:45:01 - mmengine - INFO - Iter(train) [13550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:14:13 time: 1.9516 data_time: 0.0173 memory: 6647 grad_norm: 7.2143 loss: 1.0487 decode.loss_ce: 0.7262 decode.acc_seg: 74.8723 aux.loss_ce: 0.3226 aux.acc_seg: 70.9673
2024/10/25 04:46:43 - mmengine - INFO - Iter(train) [13600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:12:51 time: 1.9596 data_time: 0.0156 memory: 6645 grad_norm: 6.5743 loss: 0.9562 decode.loss_ce: 0.6480 decode.acc_seg: 77.0137 aux.loss_ce: 0.3082 aux.acc_seg: 74.4410
2024/10/25 04:48:21 - mmengine - INFO - Iter(train) [13650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:11:12 time: 1.9585 data_time: 0.0158 memory: 6646 grad_norm: 7.8204 loss: 0.8319 decode.loss_ce: 0.5660 decode.acc_seg: 75.4983 aux.loss_ce: 0.2659 aux.acc_seg: 71.5024
2024/10/25 04:49:59 - mmengine - INFO - Iter(train) [13700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:09:32 time: 1.9630 data_time: 0.0176 memory: 6645 grad_norm: 5.9456 loss: 0.8566 decode.loss_ce: 0.5772 decode.acc_seg: 82.0370 aux.loss_ce: 0.2794 aux.acc_seg: 78.8373
2024/10/25 04:51:37 - mmengine - INFO - Iter(train) [13750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:07:52 time: 1.9523 data_time: 0.0169 memory: 6646 grad_norm:
6.2300 loss: 0.9070 decode.loss_ce: 0.6053 decode.acc_seg: 71.4653 aux.loss_ce: 0.3016 aux.acc_seg: 59.6150
2024/10/25 04:53:14 - mmengine - INFO - Iter(train) [13800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:06:12 time: 1.9569 data_time: 0.0165 memory: 6645 grad_norm: 7.9319 loss: 0.7980 decode.loss_ce: 0.5326 decode.acc_seg: 76.4401 aux.loss_ce: 0.2654 aux.acc_seg: 69.2189
2024/10/25 04:54:52 - mmengine - INFO - Iter(train) [13850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:04:32 time: 1.9541 data_time: 0.0164 memory: 6645 grad_norm: 5.3331 loss: 0.9939 decode.loss_ce: 0.6702 decode.acc_seg: 82.0687 aux.loss_ce: 0.3237 aux.acc_seg: 82.3883
2024/10/25 04:56:30 - mmengine - INFO - Iter(train) [13900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:02:53 time: 1.9606 data_time: 0.0170 memory: 6646 grad_norm: 4.4330 loss: 0.8902 decode.loss_ce: 0.6058 decode.acc_seg: 67.8275 aux.loss_ce: 0.2844 aux.acc_seg: 67.9538
2024/10/25 04:58:08 - mmengine - INFO - Iter(train) [13950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 12:01:13 time: 1.9662 data_time: 0.0158 memory: 6645 grad_norm: 5.0221 loss: 0.9487 decode.loss_ce: 0.6367 decode.acc_seg: 73.1154 aux.loss_ce: 0.3120 aux.acc_seg: 68.6458
2024/10/25 04:59:45 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 04:59:45 - mmengine - INFO - Iter(train) [14000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:59:32 time: 1.9523 data_time: 0.0171 memory: 6645 grad_norm: 8.1325 loss: 0.8899 decode.loss_ce: 0.5958 decode.acc_seg: 79.0227 aux.loss_ce: 0.2941 aux.acc_seg: 68.3568
2024/10/25 05:01:23 - mmengine - INFO - Iter(train) [14050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:57:52 time: 1.9503 data_time: 0.0173 memory: 6646 grad_norm: 5.6419 loss: 0.7388 decode.loss_ce: 0.5085 decode.acc_seg: 84.5305 aux.loss_ce: 0.2303 aux.acc_seg: 83.0019
2024/10/25 05:03:01 - mmengine - INFO - Iter(train) [14100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:56:13 time: 1.9699 data_time: 0.0161 memory: 6646 grad_norm: 5.4107 loss: 0.8413 decode.loss_ce: 0.5641 decode.acc_seg: 83.2355 aux.loss_ce: 0.2772 aux.acc_seg: 82.9527
2024/10/25 05:04:39 - mmengine - INFO - Iter(train) [14150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:54:32 time: 1.9532 data_time: 0.0158 memory: 6645 grad_norm: 5.8724 loss: 0.8242 decode.loss_ce: 0.5683 decode.acc_seg: 85.9712 aux.loss_ce: 0.2559 aux.acc_seg: 81.0889
2024/10/25 05:06:17 - mmengine - INFO - Iter(train) [14200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:52:53 time: 1.9511 data_time: 0.0165 memory: 6646 grad_norm: 7.3496 loss: 1.0712 decode.loss_ce: 0.6915 decode.acc_seg: 82.6021 aux.loss_ce: 0.3797 aux.acc_seg: 76.9021
2024/10/25 05:07:54 - mmengine - INFO - Iter(train) [14250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:51:12 time: 1.9485 data_time: 0.0164 memory: 6646 grad_norm: 5.1632 loss: 0.9490 decode.loss_ce: 0.6345 decode.acc_seg: 69.5460 aux.loss_ce: 0.3144 aux.acc_seg: 73.2183
2024/10/25 05:09:32 - mmengine - INFO - Iter(train) [14300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:49:31 time: 1.9468 data_time: 0.0168 memory: 6646 grad_norm: 4.6750 loss: 0.8764 decode.loss_ce: 0.5889 decode.acc_seg: 85.2853 aux.loss_ce: 0.2875 aux.acc_seg: 81.7458
2024/10/25 05:11:10 - mmengine - INFO - Iter(train) [14350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:47:52 time: 1.9508 data_time: 0.0153 memory: 6645 grad_norm: 5.3449 loss: 0.8724 decode.loss_ce: 0.5878 decode.acc_seg: 71.2524 aux.loss_ce: 0.2846 aux.acc_seg: 71.2705
2024/10/25 05:12:48 - mmengine - INFO - Iter(train) [14400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:46:13 time: 1.9662 data_time: 0.0169 memory: 6646 grad_norm: 8.1254 loss: 0.9520 decode.loss_ce: 0.6433 decode.acc_seg: 81.3674 aux.loss_ce: 0.3087 aux.acc_seg: 76.8591
2024/10/25 05:14:26 - mmengine - INFO - Iter(train) [14450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:44:33 time: 1.9504 data_time: 0.0165 memory: 6646 grad_norm: 8.1199 loss: 0.8376 decode.loss_ce: 0.5579 decode.acc_seg: 76.0294 aux.loss_ce: 0.2797 aux.acc_seg: 66.0402
2024/10/25 05:16:03 - mmengine - INFO - Iter(train) [14500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:42:53 time: 1.9544 data_time: 0.0163 memory: 6646 grad_norm: 6.1711 loss: 0.8733 decode.loss_ce: 0.5916 decode.acc_seg: 66.4254 aux.loss_ce: 0.2817 aux.acc_seg: 64.8869
2024/10/25 05:17:44 - mmengine - INFO - Iter(train) [14550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:41:25 time: 1.9566 data_time: 0.0165 memory: 6645 grad_norm: 5.8967 loss: 0.9511 decode.loss_ce: 0.6307 decode.acc_seg: 57.6211 aux.loss_ce: 0.3204 aux.acc_seg: 54.3537
2024/10/25 05:19:21 - mmengine - INFO - Iter(train) [14600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:39:45 time: 1.9521 data_time: 0.0168 memory: 6645 grad_norm: 6.9730 loss: 0.9845 decode.loss_ce: 0.6789 decode.acc_seg: 71.3242 aux.loss_ce: 0.3056 aux.acc_seg: 78.3785
2024/10/25 05:20:59 - mmengine - INFO - Iter(train) [14650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:38:06 time: 1.9610 data_time: 0.0154 memory: 6645 grad_norm: 5.5560 loss: 0.9006 decode.loss_ce: 0.6062 decode.acc_seg: 67.6131 aux.loss_ce: 0.2944 aux.acc_seg: 66.5854
2024/10/25 05:22:38 - mmengine - INFO - Iter(train) [14700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:36:28 time: 1.9823 data_time: 0.0168 memory: 6645 grad_norm: 5.0759 loss: 0.7822 decode.loss_ce: 0.5191 decode.acc_seg: 81.5218 aux.loss_ce: 0.2631 aux.acc_seg: 78.0011
2024/10/25 05:24:15 - mmengine - INFO - Iter(train) [14750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:34:49 time: 1.9513 data_time: 0.0171 memory: 6646 grad_norm: 5.7485 loss: 0.9753 decode.loss_ce: 0.6704 decode.acc_seg: 84.7947 aux.loss_ce: 0.3050 aux.acc_seg: 85.8367
2024/10/25 05:25:53 - mmengine - INFO - Iter(train) [14800/80000]
base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:33:08 time: 1.9516 data_time: 0.0164 memory: 6645 grad_norm: 6.3547 loss: 0.9192 decode.loss_ce: 0.6166 decode.acc_seg: 73.0985 aux.loss_ce: 0.3026 aux.acc_seg: 68.1375
2024/10/25 05:27:31 - mmengine - INFO - Iter(train) [14850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:31:28 time: 1.9515 data_time: 0.0166 memory: 6646 grad_norm: 5.4174 loss: 0.6778 decode.loss_ce: 0.4562 decode.acc_seg: 85.6904 aux.loss_ce: 0.2216 aux.acc_seg: 83.2769
2024/10/25 05:29:09 - mmengine - INFO - Iter(train) [14900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:29:49 time: 1.9549 data_time: 0.0165 memory: 6645 grad_norm: 6.8430 loss: 0.7448 decode.loss_ce: 0.5057 decode.acc_seg: 77.7458 aux.loss_ce: 0.2391 aux.acc_seg: 80.3724
2024/10/25 05:30:47 - mmengine - INFO - Iter(train) [14950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:28:09 time: 1.9521 data_time: 0.0178 memory: 6646 grad_norm: 5.3238 loss: 0.8547 decode.loss_ce: 0.5645 decode.acc_seg: 80.1612 aux.loss_ce: 0.2901 aux.acc_seg: 77.0910
2024/10/25 05:32:25 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 05:32:25 - mmengine - INFO - Iter(train) [15000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:26:32 time: 1.9919 data_time: 0.0158 memory: 6647 grad_norm: 4.8724 loss: 0.9290 decode.loss_ce: 0.6267 decode.acc_seg: 83.7816 aux.loss_ce: 0.3023 aux.acc_seg: 78.8761
2024/10/25 05:34:03 - mmengine - INFO - Iter(train) [15050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:24:52 time: 1.9559 data_time: 0.0168 memory: 6645 grad_norm: 6.8441 loss: 0.8166 decode.loss_ce: 0.5467 decode.acc_seg: 81.6923 aux.loss_ce: 0.2699 aux.acc_seg: 77.8692
2024/10/25 05:35:44 - mmengine - INFO - Iter(train) [15100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:23:26 time: 1.9486 data_time: 0.0170 memory: 6647 grad_norm: 6.6675 loss: 0.8109 decode.loss_ce: 0.5407 decode.acc_seg: 83.7164 aux.loss_ce: 0.2702 aux.acc_seg: 82.2102
2024/10/25 05:37:21 - mmengine - INFO - Iter(train) [15150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:21:46 time: 1.9527 data_time: 0.0170 memory: 6646 grad_norm: 7.5712 loss: 0.7249 decode.loss_ce: 0.4848 decode.acc_seg: 85.4850 aux.loss_ce: 0.2401 aux.acc_seg: 84.1863
2024/10/25 05:38:59 - mmengine - INFO - Iter(train) [15200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:20:07 time: 1.9509 data_time: 0.0174 memory: 6645 grad_norm: 5.9789 loss: 0.8733 decode.loss_ce: 0.5963 decode.acc_seg: 79.2516 aux.loss_ce: 0.2771 aux.acc_seg: 78.1282
2024/10/25 05:40:37 - mmengine - INFO - Iter(train) [15250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:18:27 time: 1.9498 data_time: 0.0177 memory: 6645 grad_norm: 6.3391 loss: 0.8737 decode.loss_ce: 0.6029 decode.acc_seg: 79.0443 aux.loss_ce: 0.2708 aux.acc_seg: 74.6288
2024/10/25 05:42:15 - mmengine - INFO - Iter(train) [15300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:16:47 time: 1.9548 data_time: 0.0162 memory: 6645 grad_norm: 7.4834 loss: 0.7400 decode.loss_ce: 0.5087 decode.acc_seg: 84.3191 aux.loss_ce: 0.2313 aux.acc_seg: 82.5243
2024/10/25 05:43:53 - mmengine - INFO - Iter(train) [15350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:15:07 time: 1.9446 data_time: 0.0167 memory: 6645 grad_norm: 6.6492 loss: 0.8719 decode.loss_ce: 0.5938 decode.acc_seg: 74.8268 aux.loss_ce: 0.2782 aux.acc_seg: 71.9306
2024/10/25 05:45:30 - mmengine - INFO - Iter(train) [15400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:13:27 time: 1.9496 data_time: 0.0164 memory: 6644 grad_norm: 5.2144 loss: 0.9613 decode.loss_ce: 0.6405 decode.acc_seg: 66.9731 aux.loss_ce: 0.3208 aux.acc_seg: 57.6042
2024/10/25 05:47:08 - mmengine - INFO - Iter(train) [15450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:11:48 time: 1.9538 data_time: 0.0160 memory: 6646 grad_norm: 8.7983 loss: 0.8433 decode.loss_ce: 0.5573 decode.acc_seg: 81.8254 aux.loss_ce: 0.2860 aux.acc_seg: 80.5521
2024/10/25 05:48:46 - mmengine - INFO - Iter(train) [15500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:10:10 time: 1.9486 data_time: 0.0168 memory: 6645 grad_norm: 7.4338 loss: 0.9123 decode.loss_ce: 0.6335 decode.acc_seg: 80.8244 aux.loss_ce: 0.2787 aux.acc_seg: 84.1190
2024/10/25 05:50:24 - mmengine - INFO - Iter(train) [15550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:08:30 time: 1.9706 data_time: 0.0163 memory: 6647 grad_norm: 5.9319 loss: 0.6928 decode.loss_ce: 0.4727 decode.acc_seg: 82.7747 aux.loss_ce: 0.2201 aux.acc_seg: 79.8186
2024/10/25 05:52:02 - mmengine - INFO - Iter(train) [15600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:06:51 time: 1.9554 data_time: 0.0165 memory: 6646 grad_norm: 6.4155 loss: 0.7820 decode.loss_ce: 0.5194 decode.acc_seg: 75.7165 aux.loss_ce: 0.2626 aux.acc_seg: 73.3307
2024/10/25 05:53:44 - mmengine - INFO - Iter(train) [15650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:05:29 time: 1.9540 data_time: 0.0151 memory: 6646 grad_norm: 6.2949 loss: 0.8844 decode.loss_ce: 0.5992 decode.acc_seg: 79.9092 aux.loss_ce: 0.2852 aux.acc_seg: 73.9109
2024/10/25 05:55:22 - mmengine - INFO - Iter(train) [15700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:03:50 time: 1.9604 data_time: 0.0165 memory: 6645 grad_norm: 5.3284 loss: 0.9868 decode.loss_ce: 0.6383 decode.acc_seg: 65.8157 aux.loss_ce: 0.3486 aux.acc_seg: 57.3353
2024/10/25 05:57:00 - mmengine - INFO - Iter(train) [15750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:02:10 time: 1.9514 data_time: 0.0164 memory: 6645 grad_norm: 7.3461 loss: 0.9051 decode.loss_ce: 0.6093 decode.acc_seg: 80.2923 aux.loss_ce: 0.2957 aux.acc_seg: 80.7015
2024/10/25 05:58:37 - mmengine - INFO - Iter(train) [15800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 11:00:30 time: 1.9518 data_time: 0.0157 memory: 6645 grad_norm: 4.6399 loss: 0.7500 decode.loss_ce: 0.4945 decode.acc_seg: 84.4157
aux.loss_ce: 0.2555 aux.acc_seg: 81.5798
2024/10/25 06:00:15 - mmengine - INFO - Iter(train) [15850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:58:50 time: 1.9540 data_time: 0.0169 memory: 6645 grad_norm: 5.6652 loss: 0.9070 decode.loss_ce: 0.6111 decode.acc_seg: 86.4083 aux.loss_ce: 0.2959 aux.acc_seg: 82.2861
2024/10/25 06:01:53 - mmengine - INFO - Iter(train) [15900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:57:10 time: 1.9501 data_time: 0.0165 memory: 6646 grad_norm: 6.4599 loss: 0.9638 decode.loss_ce: 0.6710 decode.acc_seg: 82.5766 aux.loss_ce: 0.2928 aux.acc_seg: 81.0005
2024/10/25 06:03:30 - mmengine - INFO - Iter(train) [15950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:55:29 time: 1.9492 data_time: 0.0160 memory: 6646 grad_norm: 4.2971 loss: 0.8648 decode.loss_ce: 0.5709 decode.acc_seg: 82.3445 aux.loss_ce: 0.2939 aux.acc_seg: 81.1413
2024/10/25 06:05:08 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 06:05:08 - mmengine - INFO - Iter(train) [16000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:53:49 time: 1.9424 data_time: 0.0172 memory: 6645 grad_norm: 5.2396 loss: 0.8712 decode.loss_ce: 0.5912 decode.acc_seg: 72.7627 aux.loss_ce: 0.2800 aux.acc_seg: 66.6864
2024/10/25 06:05:08 - mmengine - INFO - Saving checkpoint at 16000 iterations
2024/10/25 06:05:12 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0328 data_time: 0.0016 memory: 1049
2024/10/25 06:05:14 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0327 data_time: 0.0016 memory: 1117
2024/10/25 06:05:16 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0333 data_time: 0.0018 memory: 833
2024/10/25 06:05:17 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0331 data_time: 0.0017 memory: 866
2024/10/25 06:05:19 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0328 data_time: 0.0018 memory: 906
2024/10/25 06:05:21 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0330 data_time: 0.0020 memory: 2028
2024/10/25 06:05:22 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0337 data_time: 0.0020 memory: 832
2024/10/25 06:05:24 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0327 data_time: 0.0017 memory: 904
2024/10/25 06:05:26 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0320 data_time: 0.0015 memory: 839
2024/10/25 06:05:27 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0326 data_time: 0.0016 memory: 889
2024/10/25 06:05:28 - mmengine - INFO - per class results:
2024/10/25 06:05:28 - mmengine - INFO -
+---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 65.11 | 84.14 |
| building | 76.01 | 89.0 |
| sky | 88.2 | 93.1 |
| floor | 69.24 | 84.36 |
| tree | 65.89 | 83.98 |
| ceiling | 74.36 | 86.51 |
| road | 74.56 | 84.72 |
| bed | 75.95 | 84.98 |
| windowpane | 47.85 | 64.04 |
| grass | 60.65 | 83.62 |
| cabinet | 48.54 | 62.82 |
| sidewalk | 49.63 | 64.22 |
| person | 58.22 | 74.52 |
| earth | 25.79 | 34.12 |
| door | 28.41 | 38.21 |
| table | 39.28 | 55.79 |
| mountain | 56.98 | 73.33 |
| plant | 41.64 | 55.46 |
| curtain | 52.17 | 71.25 |
| chair | 34.88 | 45.15 |
| car | 68.06 | 80.16 |
| water | 36.67 | 48.49 |
| painting | 47.46 | 59.42 |
| sofa | 49.27 | 76.08 |
| shelf | 25.13 | 36.32 |
| house | 41.97 | 67.38 |
| sea | 48.36 | 83.27 |
| mirror | 48.21 | 55.48 |
| rug | 46.12 | 61.02 |
| field | 18.73 | 29.41 |
| armchair | 19.45 | 24.28 |
| seat | 38.17 | 62.58 |
| fence | 27.85 | 33.81 |
| desk | 30.24 | 48.47 |
| rock | 25.72 | 30.61 |
| wardrobe | 35.67 | 61.96 |
| lamp | 24.69 | 30.34 |
| bathtub | 65.08 | 80.2 |
| railing | 26.16 | 32.86 |
| cushion | 30.92 | 48.14 |
| base | 15.73 | 29.29 |
| box | 11.24 | 15.24 |
| column | 31.16 | 39.7 |
| signboard | 19.56 | 28.57 |
| chest of drawers | 36.7 | 45.1 |
| counter | 32.6 | 47.9 |
| sand | 25.83 | 47.63 |
| sink | 44.92 | 57.98 |
| skyscraper | 40.86 | 64.07 |
| fireplace | 56.68 | 73.95 |
| refrigerator | 59.19 | 71.15 |
| grandstand | 30.84 | 51.95 |
| path | 21.49 | 29.63 |
| stairs | 24.35 | 32.23 |
| runway | 56.37 | 74.6 |
| case | 45.55 | 60.64 |
| pool table | 77.06 | 84.94 |
| pillow | 34.54 | 41.6 |
| screen door | 54.34 | 65.07 |
| stairway | 25.97 | 38.75 |
| river | 16.85 | 46.41 |
| bridge | 51.31 | 67.11 |
| bookcase | 22.47 | 53.95 |
| blind | 19.12 | 21.28 |
| coffee table | 42.42 | 65.19 |
| toilet | 58.29 | 67.76 |
| flower | 21.89 | 33.41 |
| book | 25.26 | 37.77 |
| hill | 3.53 | 4.19 |
| bench | 30.15 | 38.99 |
| countertop | 35.47 | 43.62 |
| stove | 43.55 | 46.24 |
| palm | 36.44 | 50.48 |
| kitchen island | 22.55 | 59.98 |
| computer | 41.04 | 56.52 |
| swivel chair | 33.93 | 48.6 |
| boat | 40.78 | 70.81 |
| bar | 23.77 | 29.0 |
| arcade machine | 26.3 | 27.77 |
| hovel | 47.01 | 61.21 |
| bus | 37.5 | 41.69 |
| towel | 32.42 | 38.96 |
| light | 5.52 | 5.83 |
| truck | 15.5 | 25.26 |
| tower | 22.44 | 68.72 |
| chandelier | 45.67 | 64.58 |
| awning | 11.51 | 14.58 |
| streetlight | 2.55 | 2.82 |
| booth | 37.24 | 48.06 |
| television receiver | 47.91 | 57.25 |
| airplane | 31.54 | 45.27 |
| dirt track | 1.96 | 2.07 |
| apparel | 15.76 | 36.51 |
| pole | 2.85 | 3.13 |
| land | 0.0 | 0.0 |
| bannister | 0.83 | 0.99 |
| escalator | 25.58 | 33.84 |
| ottoman | 28.95 | 36.43 |
| bottle | 3.15 | 3.54 |
| buffet | 36.96 | 47.78 |
| poster | 6.74 | 7.05 |
| stage | 5.06 | 16.77 |
| van | 33.8 | 36.15 |
| ship | 58.65 | 65.63 |
| fountain | 0.94 | 1.0 |
| conveyer belt | 45.71 | 82.73 |
| canopy | 10.6 | 15.55 |
| washer | 54.35 | 56.23 |
| plaything | 10.67 | 17.41 |
| swimming pool | 58.33 | 79.55 |
| stool | 7.47 | 9.02 |
| barrel | 26.3 | 54.84 |
| basket | 13.51 | 17.94 |
| waterfall | 50.84 | 79.13 |
| tent | 86.48 | 94.59 |
| bag | 0.61 | 0.63 |
| minibike | 44.8 | 51.4 |
| cradle | 53.25 | 92.49 |
| oven | 26.09 | 49.96 |
| ball | 0.4 | 0.44 |
| food | 7.6 | 8.02 |
| step | 8.75 | 10.99 |
| tank | 37.76 | 40.82 |
| trade name | 3.71 | 3.78 |
| microwave | 30.91 | 37.68 |
| pot | 21.01 | 24.63 |
| animal | 34.28 | 40.24 |
| bicycle | 25.56 | 36.12 |
| lake | 0.0 | 0.0 |
| dishwasher | 34.52 | 50.79 |
| screen | 48.33 | 75.84 |
| blanket | 0.86 | 0.87 |
| sculpture | 41.73 | 45.75 |
| hood | 18.01 | 19.25 |
| sconce | 2.59 | 2.62 |
| vase | 12.96 | 17.77 |
| traffic light | 7.74 | 13.22 |
| tray | 0.22 | 0.23 |
| ashcan | 19.14 | 26.23 |
| fan | 23.29 | 30.06 |
| pier | 29.81 | 37.52 |
| crt screen | 0.0 | 0.0 |
| plate | 16.21 | 22.44 |
| monitor | 17.31 | 19.12 |
| bulletin board | 23.82 | 26.04 |
| shower | 0.0 | 0.0 |
| radiator | 28.94 | 33.91 |
| glass | 0.0 | 0.0 |
| clock | 3.02 | 3.25 |
| flag | 15.66 | 19.84 |
+---------------------+-------+-------+
2024/10/25 06:05:28 - mmengine - INFO - Iter(val) [500/500] aAcc: 74.2400 mIoU: 31.9200 mAcc: 42.9800 data_time: 0.0018 time: 0.0334
2024/10/25 06:07:07 - mmengine - INFO - Iter(train) [16050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:52:16 time: 1.9662 data_time: 0.0160 memory: 6646 grad_norm: 6.1141 loss: 0.8404 decode.loss_ce: 0.5722 decode.acc_seg: 68.3466 aux.loss_ce: 0.2682 aux.acc_seg: 62.7264
2024/10/25 06:08:44 - mmengine - INFO - Iter(train) [16100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:50:36 time: 1.9510 data_time: 0.0172 memory: 6646 grad_norm: 6.1711 loss: 0.7636 decode.loss_ce: 0.5177 decode.acc_seg: 77.7313 aux.loss_ce: 0.2459 aux.acc_seg: 78.7793
2024/10/25 06:10:22 - mmengine - INFO - Iter(train) [16150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:48:56 time: 1.9533 data_time: 0.0167 memory: 6645 grad_norm: 7.3949 loss: 1.0603 decode.loss_ce: 0.7052 decode.acc_seg: 59.1049 aux.loss_ce: 0.3551 aux.acc_seg: 53.5959
2024/10/25 06:12:00 - mmengine - INFO - Iter(train) [16200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:47:16 time: 1.9646 data_time: 0.0153 memory: 6646 grad_norm: 5.8212
loss: 0.8904 decode.loss_ce: 0.6074 decode.acc_seg: 69.9012 aux.loss_ce: 0.2830 aux.acc_seg: 70.0579
2024/10/25 06:13:37 - mmengine - INFO - Iter(train) [16250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:45:36 time: 1.9480 data_time: 0.0156 memory: 6645 grad_norm: 5.7293 loss: 0.8618 decode.loss_ce: 0.5830 decode.acc_seg: 77.2260 aux.loss_ce: 0.2788 aux.acc_seg: 72.9525
2024/10/25 06:15:15 - mmengine - INFO - Iter(train) [16300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:43:56 time: 1.9588 data_time: 0.0167 memory: 6645 grad_norm: 7.1299 loss: 0.9426 decode.loss_ce: 0.6412 decode.acc_seg: 80.6170 aux.loss_ce: 0.3015 aux.acc_seg: 78.2223
2024/10/25 06:16:53 - mmengine - INFO - Iter(train) [16350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:42:17 time: 1.9525 data_time: 0.0167 memory: 6645 grad_norm: 6.5673 loss: 0.8942 decode.loss_ce: 0.6137 decode.acc_seg: 78.9702 aux.loss_ce: 0.2806 aux.acc_seg: 74.6780
2024/10/25 06:18:31 - mmengine - INFO - Iter(train) [16400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:40:37 time: 1.9585 data_time: 0.0155 memory: 6646 grad_norm: 5.9963 loss: 0.8640 decode.loss_ce: 0.5855 decode.acc_seg: 72.5111 aux.loss_ce: 0.2786 aux.acc_seg: 68.9665
2024/10/25 06:20:08 - mmengine - INFO - Iter(train) [16450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:38:57 time: 1.9607 data_time: 0.0167 memory: 6646 grad_norm: 5.3075 loss: 0.7249 decode.loss_ce: 0.4951 decode.acc_seg: 82.7809 aux.loss_ce: 0.2298 aux.acc_seg: 77.6865
2024/10/25 06:21:46 - mmengine - INFO - Iter(train) [16500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:37:17 time: 1.9633 data_time: 0.0161 memory: 6645 grad_norm: 7.4960 loss: 0.9644 decode.loss_ce: 0.6280 decode.acc_seg: 70.1517 aux.loss_ce: 0.3364 aux.acc_seg: 65.4613
2024/10/25 06:23:24 - mmengine - INFO - Iter(train) [16550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:35:38 time: 1.9446 data_time: 0.0163 memory: 6646 grad_norm: 8.9199 loss: 0.9192 decode.loss_ce: 0.6275 decode.acc_seg: 63.2687 aux.loss_ce: 0.2917 aux.acc_seg: 57.5251
2024/10/25 06:25:02 - mmengine - INFO - Iter(train) [16600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:33:58 time: 1.9562 data_time: 0.0174 memory: 6648 grad_norm: 5.1177 loss: 0.8167 decode.loss_ce: 0.5263 decode.acc_seg: 92.7459 aux.loss_ce: 0.2904 aux.acc_seg: 92.8974
2024/10/25 06:26:44 - mmengine - INFO - Iter(train) [16650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:32:35 time: 1.9491 data_time: 0.0171 memory: 6645 grad_norm: 5.8288 loss: 0.7875 decode.loss_ce: 0.5400 decode.acc_seg: 77.4974 aux.loss_ce: 0.2474 aux.acc_seg: 77.8848
2024/10/25 06:28:21 - mmengine - INFO - Iter(train) [16700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:30:55 time: 1.9499 data_time: 0.0168 memory: 6645 grad_norm: 7.5167 loss: 0.7966 decode.loss_ce: 0.5293 decode.acc_seg: 78.5101 aux.loss_ce: 0.2673 aux.acc_seg: 77.3884
2024/10/25 06:29:59 - mmengine - INFO - Iter(train) [16750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:29:15 time: 1.9434 data_time: 0.0163 memory: 6645 grad_norm: 6.9076 loss: 0.9769 decode.loss_ce: 0.6558 decode.acc_seg: 76.2163 aux.loss_ce: 0.3211 aux.acc_seg: 71.9895
2024/10/25 06:31:37 - mmengine - INFO - Iter(train) [16800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:27:37 time: 1.9511 data_time: 0.0168 memory: 6646 grad_norm: 6.8972 loss: 0.9519 decode.loss_ce: 0.6179 decode.acc_seg: 79.2997 aux.loss_ce: 0.3341 aux.acc_seg: 78.7134
2024/10/25 06:33:15 - mmengine - INFO - Iter(train) [16850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:25:57 time: 1.9512 data_time: 0.0168 memory: 6646 grad_norm: 6.8996 loss: 0.8778 decode.loss_ce: 0.5905 decode.acc_seg: 62.9158 aux.loss_ce: 0.2874 aux.acc_seg: 67.7130
2024/10/25 06:34:53 - mmengine - INFO - Iter(train) [16900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:24:17 time: 1.9467 data_time: 0.0168 memory: 6646 grad_norm: 6.5830 loss: 0.7590 decode.loss_ce: 0.5083 decode.acc_seg: 77.8278 aux.loss_ce: 0.2507 aux.acc_seg: 75.4750
2024/10/25 06:36:30 - mmengine - INFO - Iter(train) [16950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:22:36 time: 1.9562 data_time: 0.0175 memory: 6645 grad_norm: 6.7181 loss: 0.7940 decode.loss_ce: 0.5451 decode.acc_seg: 84.0032 aux.loss_ce: 0.2489 aux.acc_seg: 83.4096
2024/10/25 06:38:08 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 06:38:08 - mmengine - INFO - Iter(train) [17000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:20:58 time: 1.9525 data_time: 0.0172 memory: 6646 grad_norm: 6.1965 loss: 1.0153 decode.loss_ce: 0.6887 decode.acc_seg: 73.0106 aux.loss_ce: 0.3266 aux.acc_seg: 73.1661
2024/10/25 06:39:46 - mmengine - INFO - Iter(train) [17050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:19:19 time: 1.9526 data_time: 0.0166 memory: 6647 grad_norm: 5.5900 loss: 0.7979 decode.loss_ce: 0.5387 decode.acc_seg: 77.8668 aux.loss_ce: 0.2592 aux.acc_seg: 72.0584
2024/10/25 06:41:24 - mmengine - INFO - Iter(train) [17100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:17:39 time: 1.9503 data_time: 0.0165 memory: 6646 grad_norm: 4.8714 loss: 0.7375 decode.loss_ce: 0.4965 decode.acc_seg: 78.9188 aux.loss_ce: 0.2410 aux.acc_seg: 71.7133
2024/10/25 06:43:01 - mmengine - INFO - Iter(train) [17150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:15:59 time: 1.9528 data_time: 0.0168 memory: 6646 grad_norm: 5.7537 loss: 0.8377 decode.loss_ce: 0.5752 decode.acc_seg: 73.0161 aux.loss_ce: 0.2625 aux.acc_seg: 73.5195
2024/10/25 06:44:39 - mmengine - INFO - Iter(train) [17200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:14:20 time: 1.9512 data_time: 0.0158 memory: 6645 grad_norm: 4.9367 loss: 0.8826 decode.loss_ce: 0.5880 decode.acc_seg: 63.9591 aux.loss_ce: 0.2946 aux.acc_seg: 66.9480
2024/10/25 06:46:17 - mmengine - INFO - Iter(train) [17250/80000] base_lr:
1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:12:40 time: 1.9526 data_time: 0.0163 memory: 6645 grad_norm: 6.7213 loss: 0.7786 decode.loss_ce: 0.5262 decode.acc_seg: 72.0998 aux.loss_ce: 0.2525 aux.acc_seg: 71.9226
2024/10/25 06:47:55 - mmengine - INFO - Iter(train) [17300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:11:01 time: 1.9551 data_time: 0.0180 memory: 6646 grad_norm: 5.4233 loss: 0.8833 decode.loss_ce: 0.6036 decode.acc_seg: 81.9725 aux.loss_ce: 0.2797 aux.acc_seg: 79.9158
2024/10/25 06:49:33 - mmengine - INFO - Iter(train) [17350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:09:22 time: 1.9533 data_time: 0.0162 memory: 6646 grad_norm: 6.6964 loss: 0.8826 decode.loss_ce: 0.6029 decode.acc_seg: 87.4180 aux.loss_ce: 0.2798 aux.acc_seg: 81.1130
2024/10/25 06:51:11 - mmengine - INFO - Iter(train) [17400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:07:44 time: 1.9503 data_time: 0.0165 memory: 6645 grad_norm: 5.1589 loss: 0.9047 decode.loss_ce: 0.5914 decode.acc_seg: 77.0377 aux.loss_ce: 0.3133 aux.acc_seg: 73.4485
2024/10/25 06:52:48 - mmengine - INFO - Iter(train) [17450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:06:03 time: 1.9468 data_time: 0.0168 memory: 6647 grad_norm: 5.9782 loss: 0.8921 decode.loss_ce: 0.6148 decode.acc_seg: 74.4123 aux.loss_ce: 0.2774 aux.acc_seg: 75.4822
2024/10/25 06:54:26 - mmengine - INFO - Iter(train) [17500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:04:23 time: 1.9552 data_time: 0.0179 memory: 6645 grad_norm: 4.8189 loss: 0.8630 decode.loss_ce: 0.5740 decode.acc_seg: 81.2347 aux.loss_ce: 0.2890 aux.acc_seg: 76.5939
2024/10/25 06:56:03 - mmengine - INFO - Iter(train) [17550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:02:43 time: 1.9482 data_time: 0.0166 memory: 6647 grad_norm: 6.4057 loss: 0.9061 decode.loss_ce: 0.5988 decode.acc_seg: 79.0727 aux.loss_ce: 0.3074 aux.acc_seg: 77.2593
2024/10/25 06:57:42 - mmengine - INFO - Iter(train) [17600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 10:01:05 time: 1.9467 data_time: 0.0175 memory: 6646 grad_norm: 7.2886 loss: 0.7958 decode.loss_ce: 0.5447 decode.acc_seg: 85.3322 aux.loss_ce: 0.2511 aux.acc_seg: 84.1922
2024/10/25 06:59:19 - mmengine - INFO - Iter(train) [17650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:59:26 time: 1.9494 data_time: 0.0178 memory: 6645 grad_norm: 8.3321 loss: 0.8516 decode.loss_ce: 0.5684 decode.acc_seg: 80.0836 aux.loss_ce: 0.2832 aux.acc_seg: 77.4938
2024/10/25 07:00:57 - mmengine - INFO - Iter(train) [17700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:57:46 time: 1.9499 data_time: 0.0165 memory: 6647 grad_norm: 3.8053 loss: 0.8126 decode.loss_ce: 0.5405 decode.acc_seg: 87.0867 aux.loss_ce: 0.2721 aux.acc_seg: 78.5823
2024/10/25 07:02:35 - mmengine - INFO - Iter(train) [17750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:56:06 time: 1.9498 data_time: 0.0165 memory: 6645 grad_norm: 6.4107 loss: 0.8220 decode.loss_ce: 0.5583 decode.acc_seg: 77.2663 aux.loss_ce: 0.2637 aux.acc_seg: 77.4586
2024/10/25 07:04:12 - mmengine - INFO - Iter(train) [17800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:54:26 time: 1.9526 data_time: 0.0173 memory: 6646 grad_norm: 5.5216 loss: 0.7463 decode.loss_ce: 0.4877 decode.acc_seg: 81.1578 aux.loss_ce: 0.2586 aux.acc_seg: 76.4503
2024/10/25 07:05:50 - mmengine - INFO - Iter(train) [17850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:52:46 time: 1.9489 data_time: 0.0174 memory: 6645 grad_norm: 5.9361 loss: 0.8191 decode.loss_ce: 0.5570 decode.acc_seg: 83.2893 aux.loss_ce: 0.2621 aux.acc_seg: 73.9269
2024/10/25 07:07:28 - mmengine - INFO - Iter(train) [17900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:51:08 time: 1.9588 data_time: 0.0171 memory: 6645 grad_norm: 7.2156 loss: 0.8928 decode.loss_ce: 0.6115 decode.acc_seg: 75.0037 aux.loss_ce: 0.2814 aux.acc_seg: 67.9696
2024/10/25 07:09:05 - mmengine - INFO - Iter(train) [17950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:49:27 time: 1.9481 data_time: 0.0167 memory: 6645 grad_norm: 7.0190 loss: 0.9086 decode.loss_ce: 0.5901 decode.acc_seg: 74.3106 aux.loss_ce: 0.3185 aux.acc_seg: 70.0023
2024/10/25 07:10:44 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 07:10:44 - mmengine - INFO - Iter(train) [18000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:47:50 time: 1.9515 data_time: 0.0168 memory: 6645 grad_norm: 5.8658 loss: 0.8002 decode.loss_ce: 0.5301 decode.acc_seg: 82.1155 aux.loss_ce: 0.2700 aux.acc_seg: 72.8438
2024/10/25 07:12:22 - mmengine - INFO - Iter(train) [18050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:46:11 time: 1.9445 data_time: 0.0161 memory: 6645 grad_norm: 6.0913 loss: 0.8274 decode.loss_ce: 0.5759 decode.acc_seg: 71.1937 aux.loss_ce: 0.2514 aux.acc_seg: 72.0604
2024/10/25 07:13:59 - mmengine - INFO - Iter(train) [18100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:44:31 time: 1.9457 data_time: 0.0167 memory: 6645 grad_norm: 5.3858 loss: 0.9262 decode.loss_ce: 0.6165 decode.acc_seg: 84.8466 aux.loss_ce: 0.3097 aux.acc_seg: 81.0072
2024/10/25 07:15:37 - mmengine - INFO - Iter(train) [18150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:42:51 time: 1.9385 data_time: 0.0163 memory: 6646 grad_norm: 7.0018 loss: 0.9437 decode.loss_ce: 0.6401 decode.acc_seg: 73.8440 aux.loss_ce: 0.3036 aux.acc_seg: 66.7146
2024/10/25 07:17:14 - mmengine - INFO - Iter(train) [18200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:41:10 time: 1.9467 data_time: 0.0167 memory: 6646 grad_norm: 8.1644 loss: 0.9777 decode.loss_ce: 0.6477 decode.acc_seg: 71.6079 aux.loss_ce: 0.3300 aux.acc_seg: 71.9299
2024/10/25 07:18:52 - mmengine - INFO - Iter(train) [18250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:39:30 time: 1.9414 data_time: 0.0169 memory: 6645 grad_norm: 5.8762 loss: 0.8774 decode.loss_ce: 0.6051 decode.acc_seg: 63.3763 aux.loss_ce: 0.2724
aux.acc_seg: 63.7877 2024/10/25 07:20:29 - mmengine - INFO - Iter(train) [18300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:37:50 time: 1.9466 data_time: 0.0165 memory: 6646 grad_norm: 6.5599 loss: 0.7753 decode.loss_ce: 0.5190 decode.acc_seg: 89.0452 aux.loss_ce: 0.2563 aux.acc_seg: 84.2682 2024/10/25 07:22:07 - mmengine - INFO - Iter(train) [18350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:36:09 time: 1.9534 data_time: 0.0168 memory: 6647 grad_norm: 4.8080 loss: 0.8109 decode.loss_ce: 0.5457 decode.acc_seg: 81.1885 aux.loss_ce: 0.2652 aux.acc_seg: 76.3527 2024/10/25 07:23:44 - mmengine - INFO - Iter(train) [18400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:34:30 time: 1.9557 data_time: 0.0154 memory: 6645 grad_norm: 4.8472 loss: 0.8790 decode.loss_ce: 0.5991 decode.acc_seg: 79.4314 aux.loss_ce: 0.2799 aux.acc_seg: 78.9881 2024/10/25 07:25:22 - mmengine - INFO - Iter(train) [18450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:32:51 time: 1.9504 data_time: 0.0161 memory: 6646 grad_norm: 5.9766 loss: 0.7580 decode.loss_ce: 0.5040 decode.acc_seg: 87.0956 aux.loss_ce: 0.2540 aux.acc_seg: 77.0118 2024/10/25 07:26:59 - mmengine - INFO - Iter(train) [18500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:31:10 time: 1.9451 data_time: 0.0172 memory: 6645 grad_norm: 5.1719 loss: 0.8462 decode.loss_ce: 0.5665 decode.acc_seg: 75.3070 aux.loss_ce: 0.2797 aux.acc_seg: 70.4063 2024/10/25 07:28:37 - mmengine - INFO - Iter(train) [18550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:29:30 time: 1.9477 data_time: 0.0171 memory: 6646 grad_norm: 5.4097 loss: 0.8482 decode.loss_ce: 0.5646 decode.acc_seg: 80.9053 aux.loss_ce: 0.2836 aux.acc_seg: 77.4503 2024/10/25 07:30:14 - mmengine - INFO - Iter(train) [18600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:27:49 time: 1.9571 data_time: 0.0166 memory: 6646 grad_norm: 5.2226 loss: 0.8826 decode.loss_ce: 0.5932 decode.acc_seg: 77.6444 aux.loss_ce: 0.2894 
aux.acc_seg: 77.8817 2024/10/25 07:31:52 - mmengine - INFO - Iter(train) [18650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:26:10 time: 1.9598 data_time: 0.0155 memory: 6646 grad_norm: 9.3456 loss: 0.9089 decode.loss_ce: 0.6006 decode.acc_seg: 77.5407 aux.loss_ce: 0.3083 aux.acc_seg: 81.0464 2024/10/25 07:33:29 - mmengine - INFO - Iter(train) [18700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:24:29 time: 1.9482 data_time: 0.0165 memory: 6646 grad_norm: 4.2334 loss: 0.7498 decode.loss_ce: 0.5075 decode.acc_seg: 88.0946 aux.loss_ce: 0.2422 aux.acc_seg: 85.1405 2024/10/25 07:35:07 - mmengine - INFO - Iter(train) [18750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:22:51 time: 1.9485 data_time: 0.0169 memory: 6646 grad_norm: 6.9897 loss: 0.8453 decode.loss_ce: 0.5750 decode.acc_seg: 85.3516 aux.loss_ce: 0.2703 aux.acc_seg: 78.4261 2024/10/25 07:36:45 - mmengine - INFO - Iter(train) [18800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:21:10 time: 1.9433 data_time: 0.0156 memory: 6646 grad_norm: 6.2746 loss: 0.7051 decode.loss_ce: 0.4704 decode.acc_seg: 81.2094 aux.loss_ce: 0.2347 aux.acc_seg: 78.1656 2024/10/25 07:38:22 - mmengine - INFO - Iter(train) [18850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:19:30 time: 1.9421 data_time: 0.0166 memory: 6645 grad_norm: 8.0397 loss: 0.9372 decode.loss_ce: 0.6301 decode.acc_seg: 85.6076 aux.loss_ce: 0.3071 aux.acc_seg: 85.6783 2024/10/25 07:40:00 - mmengine - INFO - Iter(train) [18900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:17:50 time: 1.9540 data_time: 0.0172 memory: 6645 grad_norm: 6.3948 loss: 0.8738 decode.loss_ce: 0.5779 decode.acc_seg: 81.7971 aux.loss_ce: 0.2959 aux.acc_seg: 82.4203 2024/10/25 07:41:38 - mmengine - INFO - Iter(train) [18950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:16:12 time: 1.9632 data_time: 0.0162 memory: 6646 grad_norm: 6.7678 loss: 0.7093 decode.loss_ce: 0.4818 decode.acc_seg: 79.0292 aux.loss_ce: 0.2274 
aux.acc_seg: 74.7469 2024/10/25 07:43:15 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 07:43:15 - mmengine - INFO - Iter(train) [19000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:14:33 time: 1.9639 data_time: 0.0176 memory: 6645 grad_norm: 5.4153 loss: 0.7920 decode.loss_ce: 0.5370 decode.acc_seg: 83.9718 aux.loss_ce: 0.2550 aux.acc_seg: 82.7384 2024/10/25 07:44:53 - mmengine - INFO - Iter(train) [19050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:12:53 time: 1.9487 data_time: 0.0180 memory: 6644 grad_norm: 5.7619 loss: 0.6683 decode.loss_ce: 0.4387 decode.acc_seg: 88.4785 aux.loss_ce: 0.2297 aux.acc_seg: 86.7497 2024/10/25 07:46:30 - mmengine - INFO - Iter(train) [19100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:11:12 time: 1.9399 data_time: 0.0172 memory: 6646 grad_norm: 6.8711 loss: 0.8745 decode.loss_ce: 0.6007 decode.acc_seg: 78.0399 aux.loss_ce: 0.2738 aux.acc_seg: 71.4604 2024/10/25 07:48:07 - mmengine - INFO - Iter(train) [19150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:09:30 time: 1.9392 data_time: 0.0167 memory: 6646 grad_norm: 5.2928 loss: 0.8020 decode.loss_ce: 0.5320 decode.acc_seg: 80.6218 aux.loss_ce: 0.2700 aux.acc_seg: 79.9980 2024/10/25 07:49:44 - mmengine - INFO - Iter(train) [19200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:07:49 time: 1.9373 data_time: 0.0170 memory: 6645 grad_norm: 6.0333 loss: 0.7422 decode.loss_ce: 0.4993 decode.acc_seg: 86.2243 aux.loss_ce: 0.2429 aux.acc_seg: 77.3510 2024/10/25 07:51:21 - mmengine - INFO - Iter(train) [19250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:06:08 time: 1.9528 data_time: 0.0169 memory: 6645 grad_norm: 7.0752 loss: 0.8511 decode.loss_ce: 0.5662 decode.acc_seg: 80.3327 aux.loss_ce: 0.2849 aux.acc_seg: 75.8684 2024/10/25 07:52:59 - mmengine - INFO - Iter(train) [19300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:04:28 time: 1.9418 data_time: 0.0164 memory: 6645 
grad_norm: 6.1110 loss: 0.7129 decode.loss_ce: 0.4877 decode.acc_seg: 86.4206 aux.loss_ce: 0.2252 aux.acc_seg: 86.3540 2024/10/25 07:54:37 - mmengine - INFO - Iter(train) [19350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:02:49 time: 1.9314 data_time: 0.0169 memory: 6646 grad_norm: 6.6451 loss: 0.8651 decode.loss_ce: 0.5983 decode.acc_seg: 84.6981 aux.loss_ce: 0.2668 aux.acc_seg: 84.3506 2024/10/25 07:56:14 - mmengine - INFO - Iter(train) [19400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 9:01:09 time: 1.9433 data_time: 0.0170 memory: 6645 grad_norm: 4.3543 loss: 0.7620 decode.loss_ce: 0.5122 decode.acc_seg: 80.9926 aux.loss_ce: 0.2499 aux.acc_seg: 71.6641 2024/10/25 07:57:51 - mmengine - INFO - Iter(train) [19450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:59:28 time: 1.9322 data_time: 0.0167 memory: 6645 grad_norm: 5.3586 loss: 0.7966 decode.loss_ce: 0.5294 decode.acc_seg: 79.8218 aux.loss_ce: 0.2672 aux.acc_seg: 77.2992 2024/10/25 07:59:28 - mmengine - INFO - Iter(train) [19500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:57:45 time: 1.9294 data_time: 0.0159 memory: 6645 grad_norm: 5.9041 loss: 0.9386 decode.loss_ce: 0.6187 decode.acc_seg: 67.2478 aux.loss_ce: 0.3198 aux.acc_seg: 60.4795 2024/10/25 08:01:05 - mmengine - INFO - Iter(train) [19550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:56:04 time: 1.9334 data_time: 0.0163 memory: 6647 grad_norm: 5.7327 loss: 0.7897 decode.loss_ce: 0.5366 decode.acc_seg: 72.4556 aux.loss_ce: 0.2531 aux.acc_seg: 68.7548 2024/10/25 08:02:43 - mmengine - INFO - Iter(train) [19600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:54:26 time: 1.9323 data_time: 0.0157 memory: 6646 grad_norm: 4.6707 loss: 0.7615 decode.loss_ce: 0.5076 decode.acc_seg: 86.9552 aux.loss_ce: 0.2538 aux.acc_seg: 81.0305 2024/10/25 08:04:20 - mmengine - INFO - Iter(train) [19650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:52:44 time: 1.9318 data_time: 0.0164 memory: 6647 grad_norm: 
6.2401 loss: 0.8554 decode.loss_ce: 0.5637 decode.acc_seg: 79.9675 aux.loss_ce: 0.2917 aux.acc_seg: 67.9940 2024/10/25 08:05:56 - mmengine - INFO - Iter(train) [19700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:51:01 time: 1.9313 data_time: 0.0160 memory: 6645 grad_norm: 4.3125 loss: 0.7375 decode.loss_ce: 0.4840 decode.acc_seg: 83.0102 aux.loss_ce: 0.2535 aux.acc_seg: 74.1878 2024/10/25 08:07:33 - mmengine - INFO - Iter(train) [19750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:49:19 time: 1.9344 data_time: 0.0158 memory: 6645 grad_norm: 6.4625 loss: 0.8212 decode.loss_ce: 0.5657 decode.acc_seg: 87.1813 aux.loss_ce: 0.2555 aux.acc_seg: 87.1943 2024/10/25 08:09:10 - mmengine - INFO - Iter(train) [19800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:47:37 time: 1.9308 data_time: 0.0160 memory: 6646 grad_norm: 6.9265 loss: 0.8965 decode.loss_ce: 0.6055 decode.acc_seg: 79.0521 aux.loss_ce: 0.2909 aux.acc_seg: 70.1023 2024/10/25 08:10:47 - mmengine - INFO - Iter(train) [19850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:45:55 time: 1.9333 data_time: 0.0160 memory: 6645 grad_norm: 4.5595 loss: 0.7797 decode.loss_ce: 0.5253 decode.acc_seg: 86.5514 aux.loss_ce: 0.2544 aux.acc_seg: 74.3905 2024/10/25 08:12:23 - mmengine - INFO - Iter(train) [19900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:44:13 time: 1.9326 data_time: 0.0162 memory: 6645 grad_norm: 5.4199 loss: 0.8149 decode.loss_ce: 0.5478 decode.acc_seg: 77.6145 aux.loss_ce: 0.2671 aux.acc_seg: 73.4656 2024/10/25 08:14:00 - mmengine - INFO - Iter(train) [19950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:42:32 time: 1.9504 data_time: 0.0154 memory: 6645 grad_norm: 5.2393 loss: 0.6495 decode.loss_ce: 0.4387 decode.acc_seg: 85.4266 aux.loss_ce: 0.2108 aux.acc_seg: 81.8285 2024/10/25 08:15:38 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 08:15:38 - mmengine - INFO - Iter(train) [20000/80000] base_lr: 
1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:40:53 time: 1.9503 data_time: 0.0155 memory: 6645 grad_norm: 4.9431 loss: 0.7570 decode.loss_ce: 0.5088 decode.acc_seg: 78.8489 aux.loss_ce: 0.2482 aux.acc_seg: 81.7079 2024/10/25 08:17:15 - mmengine - INFO - Iter(train) [20050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:39:13 time: 1.9327 data_time: 0.0168 memory: 6646 grad_norm: 6.1938 loss: 0.7690 decode.loss_ce: 0.5156 decode.acc_seg: 84.5279 aux.loss_ce: 0.2535 aux.acc_seg: 82.1070 2024/10/25 08:18:53 - mmengine - INFO - Iter(train) [20100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:37:32 time: 1.9324 data_time: 0.0169 memory: 6646 grad_norm: 6.3695 loss: 0.8340 decode.loss_ce: 0.5629 decode.acc_seg: 85.0291 aux.loss_ce: 0.2711 aux.acc_seg: 85.2774 2024/10/25 08:20:30 - mmengine - INFO - Iter(train) [20150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:35:51 time: 1.9393 data_time: 0.0167 memory: 6646 grad_norm: 6.1037 loss: 0.8067 decode.loss_ce: 0.5428 decode.acc_seg: 83.2737 aux.loss_ce: 0.2639 aux.acc_seg: 82.9711 2024/10/25 08:22:06 - mmengine - INFO - Iter(train) [20200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:34:09 time: 1.9322 data_time: 0.0161 memory: 6645 grad_norm: 5.5904 loss: 0.8974 decode.loss_ce: 0.5973 decode.acc_seg: 73.8400 aux.loss_ce: 0.3002 aux.acc_seg: 73.1445 2024/10/25 08:23:43 - mmengine - INFO - Iter(train) [20250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:32:28 time: 1.9342 data_time: 0.0159 memory: 6646 grad_norm: 4.6724 loss: 0.7400 decode.loss_ce: 0.5034 decode.acc_seg: 77.3228 aux.loss_ce: 0.2366 aux.acc_seg: 77.6128 2024/10/25 08:25:20 - mmengine - INFO - Iter(train) [20300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:30:47 time: 1.9323 data_time: 0.0159 memory: 6647 grad_norm: 5.5433 loss: 0.7664 decode.loss_ce: 0.5250 decode.acc_seg: 84.2537 aux.loss_ce: 0.2414 aux.acc_seg: 81.8431 2024/10/25 08:26:57 - mmengine - INFO - Iter(train) [20350/80000] base_lr: 1.2000e-04 
lr: 1.2000e-04 eta: 1 day, 8:29:05 time: 1.9307 data_time: 0.0156 memory: 6646 grad_norm: 4.5368 loss: 0.8092 decode.loss_ce: 0.5459 decode.acc_seg: 82.8821 aux.loss_ce: 0.2633 aux.acc_seg: 81.5916 2024/10/25 08:28:35 - mmengine - INFO - Iter(train) [20400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:27:25 time: 1.9508 data_time: 0.0134 memory: 6645 grad_norm: 4.1201 loss: 0.6758 decode.loss_ce: 0.4644 decode.acc_seg: 80.4356 aux.loss_ce: 0.2114 aux.acc_seg: 80.4701 2024/10/25 08:30:11 - mmengine - INFO - Iter(train) [20450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:25:44 time: 1.9312 data_time: 0.0169 memory: 6645 grad_norm: 6.0851 loss: 0.8472 decode.loss_ce: 0.5651 decode.acc_seg: 81.5571 aux.loss_ce: 0.2821 aux.acc_seg: 76.6656 2024/10/25 08:31:48 - mmengine - INFO - Iter(train) [20500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:24:03 time: 1.9387 data_time: 0.0154 memory: 6646 grad_norm: 5.7387 loss: 0.7048 decode.loss_ce: 0.4716 decode.acc_seg: 88.3368 aux.loss_ce: 0.2332 aux.acc_seg: 85.8217 2024/10/25 08:33:26 - mmengine - INFO - Iter(train) [20550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:22:22 time: 1.9468 data_time: 0.0156 memory: 6645 grad_norm: 4.0444 loss: 0.8852 decode.loss_ce: 0.6011 decode.acc_seg: 74.5231 aux.loss_ce: 0.2841 aux.acc_seg: 75.8990 2024/10/25 08:35:03 - mmengine - INFO - Iter(train) [20600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:20:43 time: 1.9582 data_time: 0.0148 memory: 6645 grad_norm: 8.0618 loss: 0.9287 decode.loss_ce: 0.6086 decode.acc_seg: 80.9626 aux.loss_ce: 0.3200 aux.acc_seg: 78.1386 2024/10/25 08:36:43 - mmengine - INFO - Iter(train) [20650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:19:11 time: 1.9405 data_time: 0.0171 memory: 6647 grad_norm: 6.0560 loss: 0.8229 decode.loss_ce: 0.5266 decode.acc_seg: 78.5149 aux.loss_ce: 0.2963 aux.acc_seg: 70.3697 2024/10/25 08:38:21 - mmengine - INFO - Iter(train) [20700/80000] base_lr: 1.2000e-04 lr: 
1.2000e-04 eta: 1 day, 8:17:32 time: 1.9439 data_time: 0.0169 memory: 6645 grad_norm: 8.1720 loss: 0.8491 decode.loss_ce: 0.5663 decode.acc_seg: 76.3914 aux.loss_ce: 0.2828 aux.acc_seg: 70.1045 2024/10/25 08:39:59 - mmengine - INFO - Iter(train) [20750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:15:53 time: 1.9480 data_time: 0.0173 memory: 6645 grad_norm: 8.1161 loss: 0.9394 decode.loss_ce: 0.6058 decode.acc_seg: 82.7947 aux.loss_ce: 0.3336 aux.acc_seg: 69.0000 2024/10/25 08:41:36 - mmengine - INFO - Iter(train) [20800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:14:13 time: 1.9495 data_time: 0.0158 memory: 6647 grad_norm: 5.5032 loss: 0.8323 decode.loss_ce: 0.5590 decode.acc_seg: 86.2561 aux.loss_ce: 0.2733 aux.acc_seg: 84.4852 2024/10/25 08:43:13 - mmengine - INFO - Iter(train) [20850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:12:33 time: 1.9497 data_time: 0.0162 memory: 6645 grad_norm: 5.8565 loss: 0.9215 decode.loss_ce: 0.6145 decode.acc_seg: 76.4099 aux.loss_ce: 0.3069 aux.acc_seg: 73.0079 2024/10/25 08:44:51 - mmengine - INFO - Iter(train) [20900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:10:54 time: 1.9469 data_time: 0.0161 memory: 6645 grad_norm: 6.4255 loss: 0.7415 decode.loss_ce: 0.4967 decode.acc_seg: 82.3994 aux.loss_ce: 0.2448 aux.acc_seg: 80.8214 2024/10/25 08:46:28 - mmengine - INFO - Iter(train) [20950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:09:14 time: 1.9428 data_time: 0.0165 memory: 6645 grad_norm: 7.6847 loss: 0.7817 decode.loss_ce: 0.5250 decode.acc_seg: 81.9741 aux.loss_ce: 0.2566 aux.acc_seg: 76.2915 2024/10/25 08:48:06 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 08:48:06 - mmengine - INFO - Iter(train) [21000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:07:34 time: 1.9488 data_time: 0.0159 memory: 6647 grad_norm: 5.7774 loss: 0.7878 decode.loss_ce: 0.5222 decode.acc_seg: 77.7844 aux.loss_ce: 0.2657 aux.acc_seg: 
71.2647 2024/10/25 08:49:43 - mmengine - INFO - Iter(train) [21050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:05:54 time: 1.9469 data_time: 0.0169 memory: 6646 grad_norm: 6.5262 loss: 0.7967 decode.loss_ce: 0.5236 decode.acc_seg: 92.1045 aux.loss_ce: 0.2731 aux.acc_seg: 85.6595 2024/10/25 08:51:20 - mmengine - INFO - Iter(train) [21100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:04:14 time: 1.9484 data_time: 0.0168 memory: 6645 grad_norm: 4.5178 loss: 0.7180 decode.loss_ce: 0.4916 decode.acc_seg: 80.3514 aux.loss_ce: 0.2264 aux.acc_seg: 77.5302 2024/10/25 08:52:58 - mmengine - INFO - Iter(train) [21150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:02:34 time: 1.9437 data_time: 0.0175 memory: 6645 grad_norm: 4.2683 loss: 0.7190 decode.loss_ce: 0.4865 decode.acc_seg: 84.2942 aux.loss_ce: 0.2324 aux.acc_seg: 85.8064 2024/10/25 08:54:35 - mmengine - INFO - Iter(train) [21200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 8:00:54 time: 1.9430 data_time: 0.0172 memory: 6646 grad_norm: 6.0965 loss: 0.7192 decode.loss_ce: 0.4762 decode.acc_seg: 74.7817 aux.loss_ce: 0.2430 aux.acc_seg: 63.9238 2024/10/25 08:56:12 - mmengine - INFO - Iter(train) [21250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:59:14 time: 1.9478 data_time: 0.0172 memory: 6646 grad_norm: 7.7568 loss: 0.8506 decode.loss_ce: 0.5641 decode.acc_seg: 76.8621 aux.loss_ce: 0.2865 aux.acc_seg: 69.4461 2024/10/25 08:57:50 - mmengine - INFO - Iter(train) [21300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:57:35 time: 1.9493 data_time: 0.0176 memory: 6646 grad_norm: 5.7348 loss: 0.7710 decode.loss_ce: 0.5094 decode.acc_seg: 75.4665 aux.loss_ce: 0.2615 aux.acc_seg: 74.8837 2024/10/25 08:59:27 - mmengine - INFO - Iter(train) [21350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:55:56 time: 1.9478 data_time: 0.0160 memory: 6645 grad_norm: 5.7826 loss: 0.7528 decode.loss_ce: 0.4961 decode.acc_seg: 83.6994 aux.loss_ce: 0.2567 aux.acc_seg: 69.6258 
2024/10/25 09:01:05 - mmengine - INFO - Iter(train) [21400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:54:17 time: 1.9504 data_time: 0.0158 memory: 6646 grad_norm: 8.5443 loss: 0.7951 decode.loss_ce: 0.5355 decode.acc_seg: 78.8006 aux.loss_ce: 0.2596 aux.acc_seg: 80.2056 2024/10/25 09:02:44 - mmengine - INFO - Iter(train) [21450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:52:42 time: 1.9483 data_time: 0.0176 memory: 6646 grad_norm: 5.6998 loss: 0.7114 decode.loss_ce: 0.4753 decode.acc_seg: 79.5001 aux.loss_ce: 0.2360 aux.acc_seg: 79.9381 2024/10/25 09:04:22 - mmengine - INFO - Iter(train) [21500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:51:03 time: 1.9469 data_time: 0.0171 memory: 6645 grad_norm: 4.7486 loss: 0.7669 decode.loss_ce: 0.5136 decode.acc_seg: 80.6602 aux.loss_ce: 0.2533 aux.acc_seg: 68.9306 2024/10/25 09:05:59 - mmengine - INFO - Iter(train) [21550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:49:24 time: 1.9454 data_time: 0.0179 memory: 6647 grad_norm: 6.6230 loss: 0.9443 decode.loss_ce: 0.6239 decode.acc_seg: 77.5213 aux.loss_ce: 0.3204 aux.acc_seg: 75.3382 2024/10/25 09:07:37 - mmengine - INFO - Iter(train) [21600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:47:44 time: 1.9473 data_time: 0.0170 memory: 6647 grad_norm: 4.8747 loss: 0.7559 decode.loss_ce: 0.4999 decode.acc_seg: 78.1893 aux.loss_ce: 0.2560 aux.acc_seg: 69.1651 2024/10/25 09:09:14 - mmengine - INFO - Iter(train) [21650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:46:05 time: 1.9573 data_time: 0.0166 memory: 6645 grad_norm: 6.1095 loss: 0.7360 decode.loss_ce: 0.4935 decode.acc_seg: 85.0677 aux.loss_ce: 0.2425 aux.acc_seg: 75.6747 2024/10/25 09:10:52 - mmengine - INFO - Iter(train) [21700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:44:25 time: 1.9417 data_time: 0.0170 memory: 6646 grad_norm: 5.8744 loss: 0.6084 decode.loss_ce: 0.4122 decode.acc_seg: 79.1096 aux.loss_ce: 0.1962 aux.acc_seg: 79.3190 2024/10/25 
09:12:29 - mmengine - INFO - Iter(train) [21750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:42:46 time: 1.9518 data_time: 0.0179 memory: 6645 grad_norm: 7.4111 loss: 0.7878 decode.loss_ce: 0.5364 decode.acc_seg: 81.6836 aux.loss_ce: 0.2514 aux.acc_seg: 67.5846 2024/10/25 09:14:07 - mmengine - INFO - Iter(train) [21800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:41:07 time: 1.9467 data_time: 0.0179 memory: 6646 grad_norm: 6.3239 loss: 0.9629 decode.loss_ce: 0.6264 decode.acc_seg: 73.8884 aux.loss_ce: 0.3365 aux.acc_seg: 74.2179 2024/10/25 09:15:45 - mmengine - INFO - Iter(train) [21850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:39:28 time: 1.9561 data_time: 0.0160 memory: 6647 grad_norm: 5.9749 loss: 0.6961 decode.loss_ce: 0.4621 decode.acc_seg: 76.5886 aux.loss_ce: 0.2341 aux.acc_seg: 71.8901 2024/10/25 09:17:22 - mmengine - INFO - Iter(train) [21900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:37:49 time: 1.9496 data_time: 0.0183 memory: 6646 grad_norm: 7.0408 loss: 0.8280 decode.loss_ce: 0.5503 decode.acc_seg: 70.3519 aux.loss_ce: 0.2777 aux.acc_seg: 63.1779 2024/10/25 09:18:59 - mmengine - INFO - Iter(train) [21950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:36:09 time: 1.9443 data_time: 0.0178 memory: 6647 grad_norm: 5.9113 loss: 0.8615 decode.loss_ce: 0.5623 decode.acc_seg: 79.1772 aux.loss_ce: 0.2992 aux.acc_seg: 72.4521 2024/10/25 09:20:37 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 09:20:37 - mmengine - INFO - Iter(train) [22000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:34:29 time: 1.9511 data_time: 0.0153 memory: 6647 grad_norm: 5.0990 loss: 0.7442 decode.loss_ce: 0.5075 decode.acc_seg: 72.6330 aux.loss_ce: 0.2368 aux.acc_seg: 73.1807 2024/10/25 09:22:15 - mmengine - INFO - Iter(train) [22050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:32:51 time: 1.9472 data_time: 0.0162 memory: 6645 grad_norm: 5.3331 loss: 0.7132 
decode.loss_ce: 0.4896 decode.acc_seg: 65.9873 aux.loss_ce: 0.2235 aux.acc_seg: 60.6832 2024/10/25 09:23:52 - mmengine - INFO - Iter(train) [22100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:31:12 time: 1.9412 data_time: 0.0161 memory: 6646 grad_norm: 6.1974 loss: 0.7228 decode.loss_ce: 0.4734 decode.acc_seg: 83.5828 aux.loss_ce: 0.2494 aux.acc_seg: 84.5423 2024/10/25 09:25:30 - mmengine - INFO - Iter(train) [22150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:29:32 time: 1.9521 data_time: 0.0155 memory: 6645 grad_norm: 6.2035 loss: 0.8547 decode.loss_ce: 0.5886 decode.acc_seg: 70.5252 aux.loss_ce: 0.2661 aux.acc_seg: 66.4388 2024/10/25 09:27:07 - mmengine - INFO - Iter(train) [22200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:27:53 time: 1.9543 data_time: 0.0165 memory: 6646 grad_norm: 5.7117 loss: 0.8380 decode.loss_ce: 0.5753 decode.acc_seg: 71.0626 aux.loss_ce: 0.2627 aux.acc_seg: 69.9154 2024/10/25 09:28:44 - mmengine - INFO - Iter(train) [22250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:26:13 time: 1.9430 data_time: 0.0171 memory: 6646 grad_norm: 6.3889 loss: 0.7414 decode.loss_ce: 0.5093 decode.acc_seg: 75.7482 aux.loss_ce: 0.2321 aux.acc_seg: 70.9532 2024/10/25 09:30:23 - mmengine - INFO - Iter(train) [22300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:24:36 time: 1.9705 data_time: 0.0177 memory: 6645 grad_norm: 4.2857 loss: 0.7195 decode.loss_ce: 0.4760 decode.acc_seg: 81.2332 aux.loss_ce: 0.2435 aux.acc_seg: 75.4511 2024/10/25 09:32:00 - mmengine - INFO - Iter(train) [22350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:22:57 time: 1.9467 data_time: 0.0163 memory: 6646 grad_norm: 4.6912 loss: 0.7829 decode.loss_ce: 0.5310 decode.acc_seg: 77.5590 aux.loss_ce: 0.2520 aux.acc_seg: 74.7063 2024/10/25 09:33:38 - mmengine - INFO - Iter(train) [22400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:21:18 time: 1.9460 data_time: 0.0175 memory: 6646 grad_norm: 4.7293 loss: 0.6602 
decode.loss_ce: 0.4530 decode.acc_seg: 80.5260 aux.loss_ce: 0.2072 aux.acc_seg: 76.4154 2024/10/25 09:35:15 - mmengine - INFO - Iter(train) [22450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:19:38 time: 1.9507 data_time: 0.0170 memory: 6646 grad_norm: 6.9389 loss: 0.6851 decode.loss_ce: 0.4537 decode.acc_seg: 81.8918 aux.loss_ce: 0.2313 aux.acc_seg: 83.4032 2024/10/25 09:36:52 - mmengine - INFO - Iter(train) [22500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:17:58 time: 1.9404 data_time: 0.0175 memory: 6645 grad_norm: 7.8649 loss: 0.8247 decode.loss_ce: 0.5390 decode.acc_seg: 73.0908 aux.loss_ce: 0.2858 aux.acc_seg: 63.1387 2024/10/25 09:38:30 - mmengine - INFO - Iter(train) [22550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:16:18 time: 1.9480 data_time: 0.0166 memory: 6646 grad_norm: 5.9540 loss: 0.7453 decode.loss_ce: 0.4870 decode.acc_seg: 87.0437 aux.loss_ce: 0.2583 aux.acc_seg: 89.6360 2024/10/25 09:40:07 - mmengine - INFO - Iter(train) [22600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:14:40 time: 1.9522 data_time: 0.0157 memory: 6645 grad_norm: 6.1826 loss: 0.7571 decode.loss_ce: 0.5107 decode.acc_seg: 78.8778 aux.loss_ce: 0.2464 aux.acc_seg: 75.1549 2024/10/25 09:41:45 - mmengine - INFO - Iter(train) [22650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:13:02 time: 1.9555 data_time: 0.0163 memory: 6647 grad_norm: 5.2119 loss: 0.7679 decode.loss_ce: 0.5205 decode.acc_seg: 85.5612 aux.loss_ce: 0.2474 aux.acc_seg: 86.4930 2024/10/25 09:43:23 - mmengine - INFO - Iter(train) [22700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:11:22 time: 1.9441 data_time: 0.0163 memory: 6646 grad_norm: 6.2201 loss: 0.9267 decode.loss_ce: 0.6185 decode.acc_seg: 77.0908 aux.loss_ce: 0.3082 aux.acc_seg: 72.6618 2024/10/25 09:45:00 - mmengine - INFO - Iter(train) [22750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:09:43 time: 1.9522 data_time: 0.0173 memory: 6646 grad_norm: 6.4467 loss: 0.8215 
decode.loss_ce: 0.5617 decode.acc_seg: 86.9609 aux.loss_ce: 0.2597 aux.acc_seg: 86.1358 2024/10/25 09:46:38 - mmengine - INFO - Iter(train) [22800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:08:05 time: 1.9626 data_time: 0.0162 memory: 6646 grad_norm: 5.0843 loss: 0.6911 decode.loss_ce: 0.4651 decode.acc_seg: 82.6274 aux.loss_ce: 0.2261 aux.acc_seg: 82.6525 2024/10/25 09:48:16 - mmengine - INFO - Iter(train) [22850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:06:26 time: 1.9478 data_time: 0.0167 memory: 6645 grad_norm: 5.4355 loss: 0.7119 decode.loss_ce: 0.4827 decode.acc_seg: 81.4957 aux.loss_ce: 0.2292 aux.acc_seg: 80.3150 2024/10/25 09:49:53 - mmengine - INFO - Iter(train) [22900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:04:46 time: 1.9428 data_time: 0.0160 memory: 6645 grad_norm: 6.9447 loss: 0.8055 decode.loss_ce: 0.5336 decode.acc_seg: 84.6549 aux.loss_ce: 0.2720 aux.acc_seg: 79.2298 2024/10/25 09:51:31 - mmengine - INFO - Iter(train) [22950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:03:07 time: 1.9494 data_time: 0.0163 memory: 6645 grad_norm: 7.5193 loss: 0.8044 decode.loss_ce: 0.5394 decode.acc_seg: 81.8966 aux.loss_ce: 0.2650 aux.acc_seg: 74.4615 2024/10/25 09:53:08 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 09:53:08 - mmengine - INFO - Iter(train) [23000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 7:01:28 time: 1.9538 data_time: 0.0164 memory: 6645 grad_norm: 5.2594 loss: 0.6803 decode.loss_ce: 0.4581 decode.acc_seg: 81.3272 aux.loss_ce: 0.2223 aux.acc_seg: 78.9484 2024/10/25 09:54:46 - mmengine - INFO - Iter(train) [23050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:59:50 time: 1.9574 data_time: 0.0154 memory: 6645 grad_norm: 5.5366 loss: 0.8736 decode.loss_ce: 0.5792 decode.acc_seg: 79.1088 aux.loss_ce: 0.2944 aux.acc_seg: 77.4883 2024/10/25 09:56:23 - mmengine - INFO - Iter(train) [23100/80000] base_lr: 1.2000e-04 lr: 
1.2000e-04 eta: 1 day, 6:58:11 time: 1.9456 data_time: 0.0160 memory: 6645 grad_norm: 6.0610 loss: 0.9746 decode.loss_ce: 0.6668 decode.acc_seg: 79.5582 aux.loss_ce: 0.3078 aux.acc_seg: 73.4761 2024/10/25 09:58:01 - mmengine - INFO - Iter(train) [23150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:56:33 time: 1.9476 data_time: 0.0166 memory: 6646 grad_norm: 5.3385 loss: 0.8058 decode.loss_ce: 0.5349 decode.acc_seg: 85.9822 aux.loss_ce: 0.2709 aux.acc_seg: 84.3582 2024/10/25 09:59:39 - mmengine - INFO - Iter(train) [23200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:54:53 time: 1.9420 data_time: 0.0176 memory: 6645 grad_norm: 8.6725 loss: 0.7274 decode.loss_ce: 0.4877 decode.acc_seg: 77.5088 aux.loss_ce: 0.2397 aux.acc_seg: 77.8147 2024/10/25 10:01:16 - mmengine - INFO - Iter(train) [23250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:53:14 time: 1.9424 data_time: 0.0169 memory: 6647 grad_norm: 4.9244 loss: 0.7756 decode.loss_ce: 0.5182 decode.acc_seg: 78.1695 aux.loss_ce: 0.2575 aux.acc_seg: 75.0266 2024/10/25 10:02:54 - mmengine - INFO - Iter(train) [23300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:51:35 time: 1.9466 data_time: 0.0167 memory: 6646 grad_norm: 5.9116 loss: 0.7796 decode.loss_ce: 0.5212 decode.acc_seg: 85.6869 aux.loss_ce: 0.2584 aux.acc_seg: 85.3614 2024/10/25 10:04:31 - mmengine - INFO - Iter(train) [23350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:49:56 time: 1.9451 data_time: 0.0178 memory: 6646 grad_norm: 4.5283 loss: 0.8712 decode.loss_ce: 0.5922 decode.acc_seg: 80.6191 aux.loss_ce: 0.2791 aux.acc_seg: 72.6198 2024/10/25 10:06:09 - mmengine - INFO - Iter(train) [23400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:48:18 time: 1.9611 data_time: 0.0161 memory: 6645 grad_norm: 5.1382 loss: 0.7308 decode.loss_ce: 0.4827 decode.acc_seg: 80.9140 aux.loss_ce: 0.2481 aux.acc_seg: 78.2839 2024/10/25 10:07:47 - mmengine - INFO - Iter(train) [23450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 
eta: 1 day, 6:46:39 time: 1.9501 data_time: 0.0168 memory: 6646 grad_norm: 6.0076 loss: 0.9169 decode.loss_ce: 0.6014 decode.acc_seg: 75.6265 aux.loss_ce: 0.3155 aux.acc_seg: 66.4729 2024/10/25 10:09:25 - mmengine - INFO - Iter(train) [23500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:45:00 time: 1.9569 data_time: 0.0159 memory: 6646 grad_norm: 5.8781 loss: 0.6849 decode.loss_ce: 0.4478 decode.acc_seg: 83.5442 aux.loss_ce: 0.2371 aux.acc_seg: 84.3835 2024/10/25 10:11:02 - mmengine - INFO - Iter(train) [23550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:43:21 time: 1.9475 data_time: 0.0152 memory: 6646 grad_norm: 6.0348 loss: 0.6792 decode.loss_ce: 0.4534 decode.acc_seg: 79.6139 aux.loss_ce: 0.2258 aux.acc_seg: 76.9615 2024/10/25 10:12:42 - mmengine - INFO - Iter(train) [23600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:41:48 time: 1.9486 data_time: 0.0173 memory: 6646 grad_norm: 6.3904 loss: 0.8078 decode.loss_ce: 0.5448 decode.acc_seg: 76.0328 aux.loss_ce: 0.2631 aux.acc_seg: 77.1308 2024/10/25 10:14:20 - mmengine - INFO - Iter(train) [23650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:40:09 time: 1.9479 data_time: 0.0160 memory: 6645 grad_norm: 5.6776 loss: 0.7304 decode.loss_ce: 0.4978 decode.acc_seg: 70.3792 aux.loss_ce: 0.2326 aux.acc_seg: 67.4343 2024/10/25 10:15:57 - mmengine - INFO - Iter(train) [23700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:38:31 time: 1.9466 data_time: 0.0171 memory: 6644 grad_norm: 6.4898 loss: 0.8201 decode.loss_ce: 0.5614 decode.acc_seg: 74.7978 aux.loss_ce: 0.2587 aux.acc_seg: 75.2927 2024/10/25 10:17:35 - mmengine - INFO - Iter(train) [23750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:36:52 time: 1.9435 data_time: 0.0165 memory: 6646 grad_norm: 5.8874 loss: 0.7464 decode.loss_ce: 0.4979 decode.acc_seg: 88.0185 aux.loss_ce: 0.2485 aux.acc_seg: 85.2801 2024/10/25 10:19:12 - mmengine - INFO - Iter(train) [23800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 
6:35:13 time: 1.9420 data_time: 0.0158 memory: 6645 grad_norm: 4.8357 loss: 0.7142 decode.loss_ce: 0.4734 decode.acc_seg: 79.2966 aux.loss_ce: 0.2408 aux.acc_seg: 72.3933
2024/10/25 10:20:50 - mmengine - INFO - Iter(train) [23850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:33:33 time: 1.9445 data_time: 0.0162 memory: 6645 grad_norm: 5.8657 loss: 0.7793 decode.loss_ce: 0.5194 decode.acc_seg: 83.0548 aux.loss_ce: 0.2599 aux.acc_seg: 71.9731
2024/10/25 10:22:27 - mmengine - INFO - Iter(train) [23900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:31:53 time: 1.9467 data_time: 0.0162 memory: 6646 grad_norm: 6.3195 loss: 0.7191 decode.loss_ce: 0.4904 decode.acc_seg: 84.0628 aux.loss_ce: 0.2287 aux.acc_seg: 86.4303
2024/10/25 10:24:04 - mmengine - INFO - Iter(train) [23950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:30:14 time: 1.9466 data_time: 0.0163 memory: 6646 grad_norm: 5.2036 loss: 0.7725 decode.loss_ce: 0.5163 decode.acc_seg: 74.7496 aux.loss_ce: 0.2562 aux.acc_seg: 74.3381
2024/10/25 10:25:43 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 10:25:43 - mmengine - INFO - Iter(train) [24000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:28:37 time: 1.9512 data_time: 0.0169 memory: 6645 grad_norm: 5.7021 loss: 0.8314 decode.loss_ce: 0.5636 decode.acc_seg: 78.7034 aux.loss_ce: 0.2678 aux.acc_seg: 70.3930
2024/10/25 10:25:43 - mmengine - INFO - Saving checkpoint at 24000 iterations
2024/10/25 10:25:48 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0316 data_time: 0.0015 memory: 1049
2024/10/25 10:25:50 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0314 data_time: 0.0015 memory: 1117
2024/10/25 10:25:51 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0321 data_time: 0.0016 memory: 833
2024/10/25 10:25:53 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:09 time: 0.0363 data_time: 0.0020 memory: 866
2024/10/25 10:25:54 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0339 data_time: 0.0019 memory: 906
2024/10/25 10:25:56 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0339 data_time: 0.0021 memory: 2028
2024/10/25 10:25:58 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:04 time: 0.0332 data_time: 0.0017 memory: 832
2024/10/25 10:26:00 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0338 data_time: 0.0017 memory: 904
2024/10/25 10:26:01 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0324 data_time: 0.0015 memory: 839
2024/10/25 10:26:03 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0328 data_time: 0.0016 memory: 889
2024/10/25 10:26:06 - mmengine - INFO - per class results:
2024/10/25 10:26:06 - mmengine - INFO -
+---------------------+-------+-------+
| Class               |   IoU |   Acc |
+---------------------+-------+-------+
| wall                | 65.77 | 81.96 |
| building            | 75.36 | 88.56 |
| sky                 | 88.47 |  93.7 |
| floor               | 69.17 | 83.62 |
| tree                | 65.03 | 82.73 |
| ceiling             | 75.54 | 87.18 |
| road                | 76.12 |  86.4 |
| bed                 | 77.45 | 87.03 |
| windowpane          | 49.59 | 67.47 |
| grass               | 58.38 | 82.05 |
| cabinet             | 48.83 | 61.56 |
| sidewalk            | 53.58 |  66.4 |
| person              | 58.68 |  77.6 |
| earth               | 25.65 | 36.54 |
| door                | 30.78 | 43.58 |
| table               | 40.95 | 57.99 |
| mountain            | 53.15 | 67.62 |
| plant               | 41.57 |  53.9 |
| curtain             | 55.67 | 73.09 |
| chair               | 37.84 | 52.85 |
| car                 | 66.16 | 81.35 |
| water               | 35.19 | 47.63 |
| painting            | 50.31 | 65.78 |
| sofa                | 51.46 | 79.84 |
| shelf               |  24.1 |  32.0 |
| house               | 36.95 | 52.19 |
| sea                 | 43.98 | 67.81 |
| mirror              | 49.43 | 56.34 |
| rug                 | 48.14 | 69.51 |
| field               | 24.92 | 40.76 |
| armchair            | 26.11 | 35.14 |
| seat                | 48.02 | 64.32 |
| fence               | 36.13 | 50.65 |
| desk                | 32.39 | 52.94 |
| rock                | 29.21 | 40.37 |
| wardrobe            | 34.45 | 58.05 |
| lamp                |  28.4 |  37.2 |
| bathtub             | 65.85 | 72.17 |
| railing             | 26.92 |  36.5 |
| cushion             |  31.2 | 42.46 |
| base                | 14.54 | 21.63 |
| box                 | 13.12 | 19.59 |
| column              | 32.23 |  47.4 |
| signboard           | 19.49 | 25.84 |
| chest of drawers    | 27.87 | 45.66 |
| counter             | 15.31 | 18.86 |
| sand                | 22.89 | 43.92 |
| sink                | 47.83 | 58.33 |
| skyscraper          | 24.98 | 31.54 |
| fireplace           | 62.82 |  76.8 |
| refrigerator        | 53.22 | 59.81 |
| grandstand          | 44.27 | 70.34 |
| path                | 16.93 | 27.14 |
| stairs              | 23.75 | 30.28 |
| runway              | 66.92 | 92.31 |
| case                | 52.82 | 78.07 |
| pool table          | 76.02 | 83.94 |
| pillow              | 36.18 | 43.28 |
| screen door         | 56.54 | 80.72 |
| stairway            | 23.84 | 30.43 |
| river               |  9.59 | 28.14 |
| bridge              | 58.38 | 66.47 |
| bookcase            | 20.09 | 31.29 |
| blind               | 24.05 | 26.18 |
| coffee table        | 40.56 | 63.03 |
| toilet              | 59.26 | 67.63 |
| flower              | 22.64 | 36.47 |
| book                | 32.26 | 52.75 |
| hill                |  3.14 |   5.3 |
| bench               | 26.72 | 39.47 |
| countertop          | 39.35 | 51.67 |
| stove               | 54.04 | 60.45 |
| palm                | 32.18 | 61.35 |
| kitchen island      | 26.39 | 49.59 |
| computer            | 52.08 | 66.59 |
| swivel chair        | 34.95 | 45.54 |
| boat                | 33.95 | 72.32 |
| bar                 | 33.49 | 44.06 |
| arcade machine      | 54.57 |  65.1 |
| hovel               | 29.16 | 58.87 |
| bus                 | 57.08 | 62.76 |
| towel               | 33.48 | 49.11 |
| light               |  8.34 |  9.15 |
| truck               | 24.16 | 34.07 |
| tower               | 20.05 | 71.28 |
| chandelier          |  45.9 | 73.92 |
| awning              | 14.57 | 17.29 |
| streetlight         |  3.16 |  3.56 |
| booth               | 48.88 | 60.09 |
| television receiver | 44.72 | 62.78 |
| airplane            | 29.76 | 44.62 |
| dirt track          |  4.43 | 33.61 |
| apparel             | 17.95 | 32.84 |
| pole                |  4.98 |   5.7 |
| land                |  4.58 |  9.94 |
| bannister           |  0.19 |  0.22 |
| escalator           | 43.95 | 65.57 |
| ottoman             | 23.13 | 38.18 |
| bottle              | 18.14 | 29.21 |
| buffet              | 39.02 | 52.54 |
| poster              | 18.08 | 21.42 |
| stage               |  8.61 | 15.49 |
| van                 | 16.72 | 18.58 |
| ship                | 15.27 | 18.99 |
| fountain            |  1.23 |  1.36 |
| conveyer belt       | 51.72 | 72.47 |
| canopy              | 17.53 | 20.63 |
| washer              | 53.47 | 63.21 |
| plaything           |  8.42 | 13.82 |
| swimming pool       | 50.46 | 65.19 |
| stool               | 17.15 | 25.91 |
| barrel              |  27.8 | 65.02 |
| basket              | 12.49 | 16.54 |
| waterfall           | 51.43 | 70.14 |
| tent                | 78.14 | 88.87 |
| bag                 |  2.84 |  3.13 |
| minibike            | 50.23 | 67.41 |
| cradle              | 51.83 |  77.1 |
| oven                | 30.32 | 47.53 |
| ball                |  0.73 |  0.82 |
| food                | 33.77 |  50.0 |
| step                | 10.78 | 13.62 |
| tank                | 30.74 | 39.66 |
| trade name          |  7.86 |  8.29 |
| microwave           | 26.81 | 30.87 |
| pot                 | 14.34 | 16.26 |
| animal              | 34.69 | 42.49 |
| bicycle             | 32.17 | 51.97 |
| lake                |  2.07 |  3.42 |
| dishwasher          | 41.65 |  55.7 |
| screen              | 59.15 | 71.19 |
| blanket             |  1.05 |  1.26 |
| sculpture           | 31.08 | 46.49 |
| hood                | 32.62 | 34.78 |
| sconce              |  5.47 |  5.72 |
| vase                | 13.12 | 17.18 |
| traffic light       |  9.39 |  17.0 |
| tray                |  0.64 |  0.75 |
| ashcan              | 20.33 |  24.8 |
| fan                 |  23.5 | 30.82 |
| pier                | 23.73 |  35.8 |
| crt screen          |  0.14 |  0.31 |
| plate               | 19.17 | 24.97 |
| monitor             | 26.22 | 29.33 |
| bulletin board      | 30.87 | 34.28 |
| shower              |  0.05 |  0.05 |
| radiator            | 32.33 | 37.21 |
| glass               |  0.69 |  0.71 |
| clock               |  4.85 |   5.4 |
| flag                |  14.9 | 20.11 |
+---------------------+-------+-------+
2024/10/25 10:26:06 - mmengine - INFO - Iter(val) [500/500] aAcc: 74.4800 mIoU: 33.2700 mAcc: 45.1600 data_time: 0.0017 time: 0.0331
2024/10/25 10:27:44 - mmengine - INFO - Iter(train) [24050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:27:06 time: 1.9540 data_time: 0.0171 memory: 6645 grad_norm: 5.3883 loss: 0.7339 decode.loss_ce: 0.4783 decode.acc_seg: 83.8259 aux.loss_ce: 0.2556 aux.acc_seg: 85.7348
2024/10/25 10:29:21 - mmengine - INFO - Iter(train) [24100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:25:27 time: 1.9441 data_time: 0.0153 memory: 6646 grad_norm: 5.5913 loss: 0.7103 decode.loss_ce: 0.4791 decode.acc_seg: 84.5068 aux.loss_ce: 0.2312 aux.acc_seg: 78.0287
2024/10/25 10:30:59 - mmengine - INFO - Iter(train) [24150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:23:48 time: 1.9442 data_time: 0.0163 memory: 6644 grad_norm: 4.1597 loss: 0.6026 decode.loss_ce: 0.3931 decode.acc_seg: 82.4927 aux.loss_ce: 0.2095 aux.acc_seg: 77.7021
2024/10/25 10:32:36 - mmengine - INFO - Iter(train) [24200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:22:08 time: 1.9487 data_time: 0.0169 memory: 6646 grad_norm: 7.7905 loss: 0.6963 decode.loss_ce: 0.4661 decode.acc_seg: 82.0646 aux.loss_ce: 0.2302 aux.acc_seg: 75.7722
2024/10/25 10:34:14 - mmengine - INFO - Iter(train) [24250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:20:29 time: 1.9463 data_time: 0.0171 memory: 6645 grad_norm: 4.0745 loss: 0.8355 decode.loss_ce: 0.5530 decode.acc_seg: 83.7764 aux.loss_ce: 0.2825 aux.acc_seg: 79.2881
2024/10/25 10:35:51 - mmengine - INFO - Iter(train) [24300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:18:51 time: 1.9545 data_time: 0.0172 memory: 6645 grad_norm: 5.9172 loss: 0.7182 decode.loss_ce: 0.4997 decode.acc_seg: 77.6182 aux.loss_ce: 0.2185 aux.acc_seg: 74.5046
2024/10/25 10:37:29 - mmengine - INFO - Iter(train) [24350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:17:12 time: 1.9497 data_time: 0.0158 memory: 6645 grad_norm: 5.7249 loss: 0.7824 decode.loss_ce: 0.5342 decode.acc_seg: 85.9743 aux.loss_ce: 0.2481 aux.acc_seg: 86.6981
2024/10/25 10:39:07 - mmengine - INFO - Iter(train) [24400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:15:34 time: 1.9480 data_time: 0.0159 memory: 6645 grad_norm: 7.6652 loss: 0.6859 decode.loss_ce: 0.4591 decode.acc_seg: 83.6825 aux.loss_ce: 0.2268 aux.acc_seg: 80.2113
2024/10/25 10:40:44 - mmengine - INFO - Iter(train) [24450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:13:55 time: 1.9468 data_time: 0.0171 memory: 6645 grad_norm: 5.5865 loss: 0.6642 decode.loss_ce: 0.4412 decode.acc_seg: 82.3076 aux.loss_ce: 0.2230 aux.acc_seg: 77.2159
2024/10/25 10:42:22 - mmengine - INFO - Iter(train) [24500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:12:16 time: 1.9420 data_time: 0.0170 memory: 6645 grad_norm: 4.7034 loss: 0.7809 decode.loss_ce: 0.5273 decode.acc_seg: 83.0109 aux.loss_ce: 0.2536 aux.acc_seg: 77.0688
2024/10/25 10:44:00 - mmengine - INFO - Iter(train) [24550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:10:38 time: 1.9483 data_time: 0.0161 memory: 6646 grad_norm: 7.4024 loss: 0.7139 decode.loss_ce: 0.4769 decode.acc_seg: 77.8123 aux.loss_ce: 0.2371 aux.acc_seg: 77.2947
2024/10/25 10:45:38 - mmengine - INFO - Iter(train) [24600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:08:59 time: 1.9490 data_time: 0.0167 memory: 6645 grad_norm: 7.5014 loss: 0.6652 decode.loss_ce: 0.4601 decode.acc_seg: 87.1175 aux.loss_ce: 0.2051 aux.acc_seg: 84.1709
2024/10/25 10:47:15 - mmengine - INFO - Iter(train) [24650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:07:20 time: 1.9523 data_time: 0.0172 memory: 6646 grad_norm: 4.0450 loss: 0.6907 decode.loss_ce: 0.4666 decode.acc_seg: 76.0057 aux.loss_ce: 0.2241 aux.acc_seg: 72.0608
2024/10/25 10:48:53 - mmengine - INFO - Iter(train) [24700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:05:43 time: 1.9548 data_time: 0.0160 memory: 6645 grad_norm: 6.9651 loss: 0.7452 decode.loss_ce: 0.4928 decode.acc_seg: 78.2078 aux.loss_ce: 0.2524 aux.acc_seg: 75.3670
2024/10/25 10:50:31 - mmengine - INFO - Iter(train) [24750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:04:04 time: 1.9470 data_time: 0.0168 memory: 6645 grad_norm: 5.8543 loss: 0.6564 decode.loss_ce: 0.4460 decode.acc_seg: 87.4365 aux.loss_ce: 0.2104 aux.acc_seg: 87.6245
2024/10/25 10:52:08 - mmengine - INFO - Iter(train) [24800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:02:25 time: 1.9491 data_time: 0.0165 memory: 6646 grad_norm: 5.4997 loss: 0.7029 decode.loss_ce: 0.4728 decode.acc_seg: 85.5365 aux.loss_ce: 0.2301 aux.acc_seg: 81.6785
2024/10/25 10:53:46 - mmengine - INFO - Iter(train) [24850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 6:00:46 time: 1.9519 data_time: 0.0156 memory: 6645 grad_norm: 6.2369 loss: 0.6640 decode.loss_ce: 0.4445 decode.acc_seg: 84.0254 aux.loss_ce: 0.2195 aux.acc_seg: 77.5005
2024/10/25 10:55:24 - mmengine - INFO - Iter(train) [24900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:59:08 time: 1.9525 data_time: 0.0161 memory: 6645 grad_norm: 6.3460 loss: 0.7727 decode.loss_ce: 0.4929 decode.acc_seg: 80.7079 aux.loss_ce: 0.2798 aux.acc_seg: 76.5950
2024/10/25 10:57:01 - mmengine - INFO - Iter(train) [24950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:57:30 time: 1.9530 data_time: 0.0157 memory: 6646 grad_norm: 5.9939 loss: 0.8574 decode.loss_ce: 0.5761 decode.acc_seg: 80.8191 aux.loss_ce: 0.2814 aux.acc_seg: 77.1083
2024/10/25 10:58:42 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 10:58:42 - mmengine - INFO - Iter(train) [25000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:55:58 time: 1.9511 data_time: 0.0171 memory: 6646 grad_norm: 5.4036 loss: 0.7152 decode.loss_ce: 0.4933 decode.acc_seg: 83.1348 aux.loss_ce: 0.2219 aux.acc_seg: 78.5879
2024/10/25 11:00:20 - mmengine - INFO - Iter(train) [25050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:54:19 time: 1.9490 data_time: 0.0170 memory: 6645 grad_norm: 5.6255 loss: 0.7580 decode.loss_ce: 0.5109 decode.acc_seg: 77.1534 aux.loss_ce: 0.2471 aux.acc_seg: 70.1833
2024/10/25 11:01:58 - mmengine - INFO - Iter(train) [25100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:52:41 time: 1.9496 data_time: 0.0167 memory: 6645 grad_norm: 5.2184 loss: 0.6594 decode.loss_ce: 0.4515 decode.acc_seg: 83.2348 aux.loss_ce: 0.2080 aux.acc_seg: 77.6095
2024/10/25 11:03:35 - mmengine - INFO - Iter(train) [25150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:51:02 time: 1.9478 data_time: 0.0166 memory: 6647 grad_norm: 6.8439 loss: 0.7508 decode.loss_ce: 0.5017 decode.acc_seg: 78.0076 aux.loss_ce: 0.2491 aux.acc_seg: 72.6938
2024/10/25 11:05:13 - mmengine - INFO - Iter(train) [25200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:49:24 time: 1.9589 data_time: 0.0176 memory: 6645 grad_norm: 4.7341 loss: 0.6446 decode.loss_ce: 0.4327 decode.acc_seg: 81.6807 aux.loss_ce: 0.2119 aux.acc_seg: 82.7858
2024/10/25 11:06:51 - mmengine - INFO - Iter(train) [25250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:47:45 time: 1.9548 data_time: 0.0162 memory: 6646 grad_norm: 6.4291 loss: 0.7664 decode.loss_ce: 0.5283 decode.acc_seg: 83.6767 aux.loss_ce: 0.2382 aux.acc_seg: 73.1292
2024/10/25 11:08:29 - mmengine - INFO - Iter(train) [25300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:46:07 time: 1.9606 data_time: 0.0155 memory: 6646 grad_norm: 9.2324 loss: 0.7951 decode.loss_ce: 0.5317 decode.acc_seg: 78.4969 aux.loss_ce: 0.2634 aux.acc_seg: 74.0416
2024/10/25 11:10:06 - mmengine - INFO - Iter(train) [25350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:44:28 time: 1.9502 data_time: 0.0165 memory: 6647 grad_norm: 4.8957 loss: 0.7070 decode.loss_ce: 0.4699 decode.acc_seg: 78.2300 aux.loss_ce: 0.2371 aux.acc_seg: 72.9061
2024/10/25 11:11:44 - mmengine - INFO - Iter(train) [25400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:42:50 time: 1.9497 data_time: 0.0153 memory: 6647 grad_norm: 5.7104 loss: 0.7166 decode.loss_ce: 0.4886 decode.acc_seg: 84.1453 aux.loss_ce: 0.2280 aux.acc_seg: 77.5928
2024/10/25 11:13:22 - mmengine - INFO - Iter(train) [25450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:41:12 time: 1.9471 data_time: 0.0167 memory: 6647 grad_norm: 6.1291 loss: 0.6762 decode.loss_ce: 0.4471 decode.acc_seg: 81.4761 aux.loss_ce: 0.2291 aux.acc_seg: 79.5557
2024/10/25 11:15:00 - mmengine - INFO - Iter(train) [25500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:39:34 time: 1.9547 data_time: 0.0177 memory: 6646 grad_norm: 4.4931 loss: 0.6792 decode.loss_ce: 0.4614 decode.acc_seg: 76.8514 aux.loss_ce: 0.2178 aux.acc_seg: 68.7316
2024/10/25 11:16:38 - mmengine - INFO - Iter(train) [25550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:37:55 time: 1.9661 data_time: 0.0157 memory: 6645 grad_norm: 5.2347 loss: 0.7411 decode.loss_ce: 0.5023 decode.acc_seg: 75.6140 aux.loss_ce: 0.2388 aux.acc_seg: 78.5151
2024/10/25 11:18:16 - mmengine - INFO - Iter(train) [25600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:36:17 time: 1.9523 data_time: 0.0177 memory: 6646 grad_norm: 5.2809 loss: 0.7556 decode.loss_ce: 0.5081 decode.acc_seg: 84.0945 aux.loss_ce: 0.2475 aux.acc_seg: 84.1032
2024/10/25 11:19:53 - mmengine - INFO - Iter(train) [25650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:34:38 time: 1.9425 data_time: 0.0170 memory: 6646 grad_norm: 4.7761 loss: 0.7257 decode.loss_ce: 0.4901 decode.acc_seg: 83.5718 aux.loss_ce: 0.2356 aux.acc_seg: 81.1442
2024/10/25 11:21:30 - mmengine - INFO - Iter(train) [25700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:32:59 time: 1.9432 data_time: 0.0168 memory: 6647 grad_norm: 5.6072 loss: 0.7830 decode.loss_ce: 0.5279 decode.acc_seg: 80.3565 aux.loss_ce: 0.2551 aux.acc_seg: 73.3894
2024/10/25 11:23:08 - mmengine - INFO - Iter(train) [25750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:31:20 time: 1.9482 data_time: 0.0176 memory: 6645 grad_norm: 6.0012 loss: 0.8858 decode.loss_ce: 0.5990 decode.acc_seg: 81.2095 aux.loss_ce: 0.2869 aux.acc_seg: 75.6890
2024/10/25 11:24:45 - mmengine - INFO - Iter(train) [25800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:29:41 time: 1.9444 data_time: 0.0169 memory: 6646 grad_norm: 4.6505 loss: 0.7371 decode.loss_ce: 0.5018 decode.acc_seg: 87.8429 aux.loss_ce: 0.2353 aux.acc_seg: 82.9522
2024/10/25 11:26:23 - mmengine - INFO - Iter(train) [25850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:28:03 time: 1.9733 data_time: 0.0153 memory: 6647 grad_norm: 5.6878 loss: 0.7479 decode.loss_ce: 0.5056 decode.acc_seg: 78.5809 aux.loss_ce: 0.2423 aux.acc_seg: 79.4709
2024/10/25 11:28:01 - mmengine - INFO - Iter(train) [25900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:26:25 time: 1.9549 data_time: 0.0171 memory: 6645 grad_norm: 4.5347 loss: 0.6646 decode.loss_ce: 0.4355 decode.acc_seg: 79.0529 aux.loss_ce: 0.2291 aux.acc_seg: 78.7847
2024/10/25 11:29:41 - mmengine - INFO - Iter(train) [25950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:24:52 time: 1.9508 data_time: 0.0162 memory: 6646 grad_norm: 5.8274 loss: 0.7206 decode.loss_ce: 0.4784 decode.acc_seg: 88.8586 aux.loss_ce: 0.2423 aux.acc_seg: 89.5265
2024/10/25 11:31:19 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 11:31:19 - mmengine - INFO - Iter(train) [26000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:23:13 time: 1.9539 data_time: 0.0154 memory: 6645 grad_norm: 7.7780 loss: 0.6359 decode.loss_ce: 0.4290 decode.acc_seg: 86.1022 aux.loss_ce: 0.2068 aux.acc_seg: 81.5343
2024/10/25 11:32:57 - mmengine - INFO - Iter(train) [26050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:21:35 time: 1.9417 data_time: 0.0166 memory: 6646 grad_norm: 5.0236 loss: 0.7121 decode.loss_ce: 0.4788 decode.acc_seg: 77.0992 aux.loss_ce: 0.2333 aux.acc_seg: 78.6591
2024/10/25 11:34:34 - mmengine - INFO - Iter(train) [26100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:19:56 time: 1.9458 data_time: 0.0164 memory: 6645 grad_norm: 4.7174 loss: 0.7549 decode.loss_ce: 0.4988 decode.acc_seg: 70.8184 aux.loss_ce: 0.2561 aux.acc_seg: 74.8673
2024/10/25 11:36:12 - mmengine - INFO - Iter(train) [26150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:18:18 time: 1.9572 data_time: 0.0164 memory: 6645 grad_norm: 7.4930 loss: 0.7164 decode.loss_ce: 0.4860 decode.acc_seg: 82.9440 aux.loss_ce: 0.2304 aux.acc_seg: 79.6489
2024/10/25 11:37:50 - mmengine - INFO - Iter(train) [26200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:16:39 time: 1.9570 data_time: 0.0163 memory: 6645 grad_norm: 4.6084 loss: 0.7168 decode.loss_ce: 0.4817 decode.acc_seg: 90.9473 aux.loss_ce: 0.2350 aux.acc_seg: 90.5151
2024/10/25 11:39:27 - mmengine - INFO - Iter(train) [26250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:15:01 time: 1.9557 data_time: 0.0164 memory: 6645 grad_norm: 4.3199 loss: 0.7115 decode.loss_ce: 0.4812 decode.acc_seg: 68.0871 aux.loss_ce: 0.2303 aux.acc_seg: 66.3685
2024/10/25 11:41:05 - mmengine - INFO - Iter(train) [26300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:13:22 time: 1.9500 data_time: 0.0172 memory: 6646 grad_norm: 7.2924 loss: 0.7706 decode.loss_ce: 0.5203 decode.acc_seg: 74.4136 aux.loss_ce: 0.2503 aux.acc_seg: 68.3474
2024/10/25 11:42:43 - mmengine - INFO - Iter(train) [26350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:11:43 time: 1.9648 data_time: 0.0162 memory: 6646 grad_norm: 5.6449 loss: 0.6697 decode.loss_ce: 0.4579 decode.acc_seg: 81.1173 aux.loss_ce: 0.2118 aux.acc_seg: 77.5060
2024/10/25 11:44:21 - mmengine - INFO - Iter(train) [26400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:10:05 time: 1.9464 data_time: 0.0173 memory: 6645 grad_norm: 5.6126 loss: 0.6738 decode.loss_ce: 0.4455 decode.acc_seg: 80.9138 aux.loss_ce: 0.2283 aux.acc_seg: 75.5709
2024/10/25 11:45:58 - mmengine - INFO - Iter(train) [26450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:08:27 time: 1.9623 data_time: 0.0173 memory: 6647 grad_norm: 5.5810 loss: 0.6758 decode.loss_ce: 0.4596 decode.acc_seg: 81.4756 aux.loss_ce: 0.2162 aux.acc_seg: 81.1353
2024/10/25 11:47:36 - mmengine - INFO - Iter(train) [26500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:06:49 time: 1.9706 data_time: 0.0158 memory: 6646 grad_norm: 7.5062 loss: 0.8516 decode.loss_ce: 0.5838 decode.acc_seg: 81.8288 aux.loss_ce: 0.2678 aux.acc_seg: 80.2041
2024/10/25 11:49:14 - mmengine - INFO - Iter(train) [26550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:05:10 time: 1.9541 data_time: 0.0164 memory: 6645 grad_norm: 5.5309 loss: 0.6842 decode.loss_ce: 0.4591 decode.acc_seg: 89.7106 aux.loss_ce: 0.2251 aux.acc_seg: 84.6193
2024/10/25 11:50:51 - mmengine - INFO - Iter(train) [26600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:03:31 time: 1.9493 data_time: 0.0167 memory: 6646 grad_norm: 6.1756 loss: 0.7027 decode.loss_ce: 0.4657 decode.acc_seg: 85.6895 aux.loss_ce: 0.2370 aux.acc_seg: 91.4769
2024/10/25 11:52:29 - mmengine - INFO - Iter(train) [26650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:01:53 time: 1.9516 data_time: 0.0175 memory: 6645 grad_norm: 5.6940 loss: 0.7271 decode.loss_ce: 0.4794 decode.acc_seg: 83.6618 aux.loss_ce: 0.2477 aux.acc_seg: 68.0311
2024/10/25 11:54:07 - mmengine - INFO - Iter(train) [26700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 5:00:15 time: 1.9560 data_time: 0.0169 memory: 6646 grad_norm: 5.1233 loss: 0.6896 decode.loss_ce: 0.4532 decode.acc_seg: 82.7445 aux.loss_ce: 0.2364 aux.acc_seg: 78.4579
2024/10/25 11:55:45 - mmengine - INFO - Iter(train) [26750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:58:36 time: 1.9498 data_time: 0.0163 memory: 6645 grad_norm: 5.1690 loss: 0.6347 decode.loss_ce: 0.4249 decode.acc_seg: 80.2337 aux.loss_ce: 0.2098 aux.acc_seg: 79.1057
2024/10/25 11:57:23 - mmengine - INFO - Iter(train) [26800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:56:59 time: 1.9884 data_time: 0.0179 memory: 6646 grad_norm: 6.1853 loss: 0.7547 decode.loss_ce: 0.5039 decode.acc_seg: 77.7018 aux.loss_ce: 0.2508 aux.acc_seg: 76.3923
2024/10/25 11:59:00 - mmengine - INFO - Iter(train) [26850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:55:20 time: 1.9525 data_time: 0.0170 memory: 6645 grad_norm: 5.0385 loss: 0.7250 decode.loss_ce: 0.4915 decode.acc_seg: 77.1855 aux.loss_ce: 0.2335 aux.acc_seg: 74.4043
2024/10/25 12:00:38 - mmengine - INFO - Iter(train) [26900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:53:42 time: 1.9572 data_time: 0.0167 memory: 6646 grad_norm: 5.1839 loss: 0.6371 decode.loss_ce: 0.4259 decode.acc_seg: 85.5684 aux.loss_ce: 0.2112 aux.acc_seg: 79.7400
2024/10/25 12:02:16 - mmengine - INFO - Iter(train) [26950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:52:03 time: 1.9478 data_time: 0.0158 memory: 6646 grad_norm: 7.8051 loss: 0.7862 decode.loss_ce: 0.5190 decode.acc_seg: 84.4574 aux.loss_ce: 0.2673 aux.acc_seg: 79.1411
2024/10/25 12:03:53 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 12:03:53 - mmengine - INFO - Iter(train) [27000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:50:24 time: 1.9435 data_time: 0.0151 memory: 6645 grad_norm: 6.9372 loss: 0.7479 decode.loss_ce: 0.5181 decode.acc_seg: 82.9098 aux.loss_ce: 0.2298 aux.acc_seg: 82.4515
2024/10/25 12:05:31 - mmengine - INFO - Iter(train) [27050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:48:46 time: 1.9433 data_time: 0.0161 memory: 6646 grad_norm: 4.6883 loss: 0.6559 decode.loss_ce: 0.4469 decode.acc_seg: 79.8761 aux.loss_ce: 0.2091 aux.acc_seg: 80.9848
2024/10/25 12:07:09 - mmengine - INFO - Iter(train) [27100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:47:07 time: 1.9477 data_time: 0.0165 memory: 6646 grad_norm: 4.6703 loss: 0.7646 decode.loss_ce: 0.5093 decode.acc_seg: 79.2575 aux.loss_ce: 0.2553 aux.acc_seg: 73.8239
2024/10/25 12:08:47 - mmengine - INFO - Iter(train) [27150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:45:29 time: 1.9584 data_time: 0.0155 memory: 6646 grad_norm: 4.7572 loss: 0.7169 decode.loss_ce: 0.4718 decode.acc_seg: 77.9906 aux.loss_ce: 0.2451 aux.acc_seg: 73.9409
2024/10/25 12:10:24 - mmengine - INFO - Iter(train) [27200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:43:51 time: 1.9642 data_time: 0.0162 memory: 6646 grad_norm: 5.1421 loss: 0.6950 decode.loss_ce: 0.4680 decode.acc_seg: 75.8317 aux.loss_ce: 0.2270 aux.acc_seg: 71.3175
2024/10/25 12:12:02 - mmengine - INFO - Iter(train) [27250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:42:13 time: 1.9518 data_time: 0.0168 memory: 6645 grad_norm: 5.4247 loss: 0.7364 decode.loss_ce: 0.4907 decode.acc_seg: 80.8660 aux.loss_ce: 0.2457 aux.acc_seg: 78.6551
2024/10/25 12:13:43 - mmengine - INFO - Iter(train) [27300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:40:39 time: 1.9485 data_time: 0.0177 memory: 6645 grad_norm: 6.1524 loss: 0.7645 decode.loss_ce: 0.5174 decode.acc_seg: 85.6870 aux.loss_ce: 0.2472 aux.acc_seg: 84.5887
2024/10/25 12:15:20 - mmengine - INFO - Iter(train) [27350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:39:01 time: 1.9542 data_time: 0.0168 memory: 6645 grad_norm: 6.0643 loss: 0.7852 decode.loss_ce: 0.5143 decode.acc_seg: 85.2266 aux.loss_ce: 0.2709 aux.acc_seg: 81.3473
2024/10/25 12:16:58 - mmengine - INFO - Iter(train) [27400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:37:23 time: 1.9858 data_time: 0.0161 memory: 6645 grad_norm: 6.5661 loss: 0.8269 decode.loss_ce: 0.5629 decode.acc_seg: 66.0080 aux.loss_ce: 0.2640 aux.acc_seg: 66.2629
2024/10/25 12:18:36 - mmengine - INFO - Iter(train) [27450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:35:44 time: 1.9583 data_time: 0.0165 memory: 6646 grad_norm: 6.8180 loss: 0.8364 decode.loss_ce: 0.5419 decode.acc_seg: 80.7570 aux.loss_ce: 0.2945 aux.acc_seg: 85.1725
2024/10/25 12:20:13 - mmengine - INFO - Iter(train) [27500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:34:06 time: 1.9571 data_time: 0.0163 memory: 6646 grad_norm: 4.1496 loss: 0.6631 decode.loss_ce: 0.4414 decode.acc_seg: 80.9064 aux.loss_ce: 0.2217 aux.acc_seg: 68.6529
2024/10/25 12:21:51 - mmengine - INFO - Iter(train) [27550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:32:27 time: 1.9464 data_time: 0.0175 memory: 6645 grad_norm: 5.0665 loss: 0.6562 decode.loss_ce: 0.4473 decode.acc_seg: 87.6156 aux.loss_ce: 0.2089 aux.acc_seg: 86.7878
2024/10/25 12:23:29 - mmengine - INFO - Iter(train) [27600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:30:48 time: 1.9498 data_time: 0.0169 memory: 6646 grad_norm: 5.8900 loss: 0.7394 decode.loss_ce: 0.4969 decode.acc_seg: 81.3242 aux.loss_ce: 0.2425 aux.acc_seg: 67.9849
2024/10/25 12:25:06 - mmengine - INFO - Iter(train) [27650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:29:09 time: 1.9465 data_time: 0.0173 memory: 6645 grad_norm: 6.5474 loss: 0.6632 decode.loss_ce: 0.4544 decode.acc_seg: 84.4029 aux.loss_ce: 0.2088 aux.acc_seg: 80.7438
2024/10/25 12:26:43 - mmengine - INFO - Iter(train) [27700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:27:31 time: 1.9543 data_time: 0.0153 memory: 6647 grad_norm: 4.7617 loss: 0.7080 decode.loss_ce: 0.4608 decode.acc_seg: 88.5657 aux.loss_ce: 0.2471 aux.acc_seg: 80.6303
2024/10/25 12:28:21 - mmengine - INFO - Iter(train) [27750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:25:52 time: 1.9474 data_time: 0.0175 memory: 6645 grad_norm: 5.9228 loss: 0.8397 decode.loss_ce: 0.5598 decode.acc_seg: 77.6493 aux.loss_ce: 0.2800 aux.acc_seg: 76.8325
2024/10/25 12:29:59 - mmengine - INFO - Iter(train) [27800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:24:13 time: 1.9589 data_time: 0.0168 memory: 6645 grad_norm: 7.6062 loss: 0.6919 decode.loss_ce: 0.4725 decode.acc_seg: 86.3439 aux.loss_ce: 0.2194 aux.acc_seg: 82.5466
2024/10/25 12:31:36 - mmengine - INFO - Iter(train) [27850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:22:35 time: 1.9448 data_time: 0.0169 memory: 6646 grad_norm: 3.7141 loss: 0.6398 decode.loss_ce: 0.4316 decode.acc_seg: 86.8016 aux.loss_ce: 0.2082 aux.acc_seg: 81.4586
2024/10/25 12:33:14 - mmengine - INFO - Iter(train) [27900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:20:56 time: 1.9503 data_time: 0.0169 memory: 6646 grad_norm: 5.9506 loss: 0.6864 decode.loss_ce: 0.4719 decode.acc_seg: 80.7424 aux.loss_ce: 0.2145 aux.acc_seg: 78.6265
2024/10/25 12:34:52 - mmengine - INFO - Iter(train) [27950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:19:18 time: 1.9550 data_time: 0.0162 memory: 6645 grad_norm: 4.4561 loss: 0.6968 decode.loss_ce: 0.4647 decode.acc_seg: 88.2003 aux.loss_ce: 0.2322 aux.acc_seg: 84.0533
2024/10/25 12:36:29 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 12:36:29 - mmengine - INFO - Iter(train) [28000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:17:40 time: 1.9421 data_time: 0.0162 memory: 6645 grad_norm: 5.5336 loss: 0.6804 decode.loss_ce: 0.4477 decode.acc_seg: 85.7245 aux.loss_ce: 0.2326 aux.acc_seg: 84.6901
2024/10/25 12:38:07 - mmengine - INFO - Iter(train) [28050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:16:01 time: 1.9556 data_time: 0.0169 memory: 6646 grad_norm: 7.5714 loss: 0.7035 decode.loss_ce: 0.4683 decode.acc_seg: 76.6008 aux.loss_ce: 0.2352 aux.acc_seg: 67.2973
2024/10/25 12:39:45 - mmengine - INFO - Iter(train) [28100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:14:22 time: 1.9480 data_time: 0.0169 memory: 6646 grad_norm: 7.4887 loss: 0.7329 decode.loss_ce: 0.4975 decode.acc_seg: 88.7438 aux.loss_ce: 0.2355 aux.acc_seg: 87.2073
2024/10/25 12:41:23 - mmengine - INFO - Iter(train) [28150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:12:45 time: 1.9698 data_time: 0.0153 memory: 6645 grad_norm: 5.5652 loss: 0.6223 decode.loss_ce: 0.4107 decode.acc_seg: 83.1757 aux.loss_ce: 0.2116 aux.acc_seg: 78.4214
2024/10/25 12:43:01 - mmengine - INFO - Iter(train) [28200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:11:07 time: 1.9543 data_time: 0.0162 memory: 6645 grad_norm: 5.3145 loss: 0.7136 decode.loss_ce: 0.4622 decode.acc_seg: 79.1581 aux.loss_ce: 0.2514 aux.acc_seg: 70.3862
2024/10/25 12:44:38 - mmengine - INFO - Iter(train) [28250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:09:28 time: 1.9547 data_time: 0.0164 memory: 6648 grad_norm: 4.2353 loss: 0.6910 decode.loss_ce: 0.4595 decode.acc_seg: 83.4976 aux.loss_ce: 0.2315 aux.acc_seg: 81.5164
2024/10/25 12:46:16 - mmengine - INFO - Iter(train) [28300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:07:50 time: 1.9512 data_time: 0.0162 memory: 6646 grad_norm: 6.3126 loss: 0.7894 decode.loss_ce: 0.5180 decode.acc_seg: 78.9089 aux.loss_ce: 0.2713 aux.acc_seg: 68.0340
2024/10/25 12:47:54 - mmengine - INFO - Iter(train) [28350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:06:12 time: 1.9557 data_time: 0.0162 memory: 6648 grad_norm: 5.5377 loss: 0.6299 decode.loss_ce: 0.4281 decode.acc_seg: 78.6610 aux.loss_ce: 0.2018 aux.acc_seg: 70.2440
2024/10/25 12:49:32 - mmengine - INFO - Iter(train) [28400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:04:34 time: 1.9464 data_time: 0.0162 memory: 6645 grad_norm: 6.4214 loss: 0.7813 decode.loss_ce: 0.5164 decode.acc_seg: 69.0398 aux.loss_ce: 0.2649 aux.acc_seg: 69.0676
2024/10/25 12:51:10 - mmengine - INFO - Iter(train) [28450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:02:56 time: 1.9521 data_time: 0.0173 memory: 6646 grad_norm: 5.7101 loss: 0.7657 decode.loss_ce: 0.5145 decode.acc_seg: 88.1972 aux.loss_ce: 0.2512 aux.acc_seg: 82.1167
2024/10/25 12:52:48 - mmengine - INFO - Iter(train) [28500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 4:01:18 time: 1.9539 data_time: 0.0167 memory: 6645 grad_norm: 5.3621 loss: 0.6125 decode.loss_ce: 0.4153 decode.acc_seg: 76.5598 aux.loss_ce: 0.1971 aux.acc_seg: 77.7210
2024/10/25 12:54:25 - mmengine - INFO - Iter(train) [28550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:59:40 time: 1.9445 data_time: 0.0164 memory: 6646 grad_norm: 4.9789 loss: 0.7434 decode.loss_ce: 0.4926 decode.acc_seg: 81.8116 aux.loss_ce: 0.2508 aux.acc_seg: 80.3199
2024/10/25 12:56:03 - mmengine - INFO - Iter(train) [28600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:58:02 time: 1.9520 data_time: 0.0170 memory: 6645 grad_norm: 4.9909 loss: 0.7850 decode.loss_ce: 0.5179 decode.acc_seg: 80.5385 aux.loss_ce: 0.2671 aux.acc_seg: 77.3729
2024/10/25 12:57:43 - mmengine - INFO - Iter(train) [28650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:56:27 time: 1.9552 data_time: 0.0185 memory: 6647 grad_norm: 5.5074 loss: 0.7081 decode.loss_ce: 0.4655 decode.acc_seg: 83.0669 aux.loss_ce: 0.2427 aux.acc_seg: 83.1313
2024/10/25 12:59:21 - mmengine - INFO - Iter(train) [28700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:54:48 time: 1.9518 data_time: 0.0176 memory: 6646 grad_norm: 7.4093 loss: 0.6986 decode.loss_ce: 0.4742 decode.acc_seg: 82.3117 aux.loss_ce: 0.2244 aux.acc_seg: 80.3857
2024/10/25 13:00:58 - mmengine - INFO - Iter(train) [28750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:53:10 time: 1.9743 data_time: 0.0169 memory: 6645 grad_norm: 6.9347 loss: 0.7567 decode.loss_ce: 0.5125 decode.acc_seg: 81.8995 aux.loss_ce: 0.2441 aux.acc_seg: 80.6858
2024/10/25 13:02:36 - mmengine - INFO - Iter(train) [28800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:51:31 time: 1.9433 data_time: 0.0161 memory: 6646 grad_norm: 6.1934 loss: 0.7549 decode.loss_ce: 0.5122 decode.acc_seg: 84.1612 aux.loss_ce: 0.2427 aux.acc_seg: 81.8731
2024/10/25 13:04:13 - mmengine - INFO - Iter(train) [28850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:49:53 time: 1.9494 data_time: 0.0168 memory: 6645 grad_norm: 6.4974 loss: 0.7568 decode.loss_ce: 0.5042 decode.acc_seg: 84.4163 aux.loss_ce: 0.2526 aux.acc_seg: 84.9265
2024/10/25 13:05:51 - mmengine - INFO - Iter(train) [28900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:48:14 time: 1.9512 data_time: 0.0170 memory: 6645 grad_norm: 6.3666 loss: 0.8041 decode.loss_ce: 0.5474 decode.acc_seg: 85.4802 aux.loss_ce: 0.2568 aux.acc_seg: 81.9259
2024/10/25 13:07:29 - mmengine - INFO - Iter(train) [28950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:46:36 time: 1.9425 data_time: 0.0168 memory: 6645 grad_norm: 10.3074 loss: 0.8451 decode.loss_ce: 0.5827 decode.acc_seg: 86.2365 aux.loss_ce: 0.2623 aux.acc_seg: 83.1798
2024/10/25 13:09:06 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 13:09:06 - mmengine - INFO - Iter(train) [29000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:44:57 time: 1.9582 data_time: 0.0162 memory: 6645 grad_norm: 4.9870 loss: 0.6736 decode.loss_ce: 0.4500 decode.acc_seg: 85.8764 aux.loss_ce: 0.2236 aux.acc_seg: 83.6527
2024/10/25 13:10:44 - mmengine - INFO - Iter(train) [29050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:43:18 time: 1.9474 data_time: 0.0165 memory: 6645 grad_norm: 5.6080 loss: 0.9066 decode.loss_ce: 0.6139 decode.acc_seg: 72.6487 aux.loss_ce: 0.2928 aux.acc_seg: 69.3344
2024/10/25 13:12:21 - mmengine - INFO - Iter(train) [29100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:41:39 time: 1.9521 data_time: 0.0169 memory: 6646 grad_norm: 8.1534 loss: 0.7450 decode.loss_ce: 0.5070 decode.acc_seg: 81.8286 aux.loss_ce: 0.2380 aux.acc_seg: 74.8168
2024/10/25 13:13:59 - mmengine - INFO - Iter(train) [29150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:40:01 time: 1.9584 data_time: 0.0150 memory: 6646 grad_norm: 6.2888 loss: 0.6879 decode.loss_ce: 0.4640 decode.acc_seg: 77.2000 aux.loss_ce: 0.2239 aux.acc_seg: 71.5560
2024/10/25 13:15:36 - mmengine - INFO - Iter(train) [29200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:38:22 time: 1.9418 data_time: 0.0169 memory: 6645 grad_norm: 5.7496 loss: 0.7559 decode.loss_ce: 0.5080 decode.acc_seg: 75.2177 aux.loss_ce: 0.2479 aux.acc_seg: 68.6040
2024/10/25 13:17:14 - mmengine - INFO - Iter(train) [29250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:36:44 time: 1.9519 data_time: 0.0167 memory: 6645 grad_norm: 5.2042 loss: 0.6972 decode.loss_ce: 0.4689 decode.acc_seg: 76.7798 aux.loss_ce: 0.2284 aux.acc_seg: 73.2306
2024/10/25 13:18:51 - mmengine - INFO - Iter(train) [29300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:35:05 time: 1.9536 data_time: 0.0172 memory: 6646 grad_norm: 7.0340 loss: 0.6389 decode.loss_ce: 0.4301 decode.acc_seg: 77.7734 aux.loss_ce: 0.2088 aux.acc_seg: 78.6911
2024/10/25 13:20:29 - mmengine - INFO - Iter(train) [29350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:33:26 time: 1.9451 data_time:
0.0158 memory: 6646 grad_norm: 6.1190 loss: 0.7558 decode.loss_ce: 0.4924 decode.acc_seg: 88.6149 aux.loss_ce: 0.2634 aux.acc_seg: 83.9243 2024/10/25 13:22:07 - mmengine - INFO - Iter(train) [29400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:31:48 time: 1.9628 data_time: 0.0159 memory: 6645 grad_norm: 5.7886 loss: 0.7619 decode.loss_ce: 0.4974 decode.acc_seg: 82.7582 aux.loss_ce: 0.2644 aux.acc_seg: 82.3376 2024/10/25 13:23:44 - mmengine - INFO - Iter(train) [29450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:30:10 time: 1.9533 data_time: 0.0161 memory: 6645 grad_norm: 4.5313 loss: 0.6789 decode.loss_ce: 0.4641 decode.acc_seg: 79.4241 aux.loss_ce: 0.2148 aux.acc_seg: 81.2864 2024/10/25 13:25:22 - mmengine - INFO - Iter(train) [29500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:28:31 time: 1.9445 data_time: 0.0155 memory: 6646 grad_norm: 6.3003 loss: 0.6741 decode.loss_ce: 0.4539 decode.acc_seg: 77.6390 aux.loss_ce: 0.2202 aux.acc_seg: 74.3972 2024/10/25 13:27:00 - mmengine - INFO - Iter(train) [29550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:26:53 time: 1.9443 data_time: 0.0168 memory: 6645 grad_norm: 6.7776 loss: 0.7207 decode.loss_ce: 0.4914 decode.acc_seg: 87.3276 aux.loss_ce: 0.2293 aux.acc_seg: 84.2571 2024/10/25 13:28:37 - mmengine - INFO - Iter(train) [29600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:25:14 time: 1.9494 data_time: 0.0162 memory: 6645 grad_norm: 4.4582 loss: 0.6036 decode.loss_ce: 0.4062 decode.acc_seg: 77.4591 aux.loss_ce: 0.1974 aux.acc_seg: 73.6884 2024/10/25 13:30:15 - mmengine - INFO - Iter(train) [29650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:23:36 time: 1.9487 data_time: 0.0158 memory: 6645 grad_norm: 6.1145 loss: 0.8051 decode.loss_ce: 0.5573 decode.acc_seg: 75.8192 aux.loss_ce: 0.2478 aux.acc_seg: 75.1852 2024/10/25 13:31:53 - mmengine - INFO - Iter(train) [29700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:21:58 time: 1.9491 data_time: 0.0163 
memory: 6646 grad_norm: 6.3820 loss: 0.8197 decode.loss_ce: 0.5634 decode.acc_seg: 75.6879 aux.loss_ce: 0.2563 aux.acc_seg: 71.6367 2024/10/25 13:33:30 - mmengine - INFO - Iter(train) [29750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:20:19 time: 1.9690 data_time: 0.0158 memory: 6645 grad_norm: 5.8017 loss: 0.8466 decode.loss_ce: 0.5699 decode.acc_seg: 73.4103 aux.loss_ce: 0.2767 aux.acc_seg: 72.8248 2024/10/25 13:35:08 - mmengine - INFO - Iter(train) [29800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:18:41 time: 1.9463 data_time: 0.0164 memory: 6646 grad_norm: 6.0692 loss: 0.7037 decode.loss_ce: 0.4731 decode.acc_seg: 65.4594 aux.loss_ce: 0.2306 aux.acc_seg: 68.5975 2024/10/25 13:36:45 - mmengine - INFO - Iter(train) [29850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:17:02 time: 1.9457 data_time: 0.0173 memory: 6646 grad_norm: 5.6268 loss: 0.7203 decode.loss_ce: 0.4804 decode.acc_seg: 81.8465 aux.loss_ce: 0.2398 aux.acc_seg: 70.0910 2024/10/25 13:38:23 - mmengine - INFO - Iter(train) [29900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:15:24 time: 1.9582 data_time: 0.0170 memory: 6647 grad_norm: 5.1356 loss: 0.6588 decode.loss_ce: 0.4465 decode.acc_seg: 86.4846 aux.loss_ce: 0.2123 aux.acc_seg: 86.1041 2024/10/25 13:40:01 - mmengine - INFO - Iter(train) [29950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:13:46 time: 1.9490 data_time: 0.0157 memory: 6645 grad_norm: 6.9448 loss: 0.7048 decode.loss_ce: 0.4741 decode.acc_seg: 85.9660 aux.loss_ce: 0.2307 aux.acc_seg: 82.8864 2024/10/25 13:41:38 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 13:41:38 - mmengine - INFO - Iter(train) [30000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:12:07 time: 1.9455 data_time: 0.0166 memory: 6646 grad_norm: 5.0244 loss: 0.6792 decode.loss_ce: 0.4648 decode.acc_seg: 84.8141 aux.loss_ce: 0.2144 aux.acc_seg: 77.2886 2024/10/25 13:43:16 - mmengine - INFO - Iter(train) 
[30050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:10:28 time: 1.9518 data_time: 0.0162 memory: 6645 grad_norm: 5.7594 loss: 0.6889 decode.loss_ce: 0.4584 decode.acc_seg: 81.1455 aux.loss_ce: 0.2305 aux.acc_seg: 81.7292 2024/10/25 13:44:53 - mmengine - INFO - Iter(train) [30100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:08:49 time: 1.9429 data_time: 0.0168 memory: 6645 grad_norm: 5.6323 loss: 0.7144 decode.loss_ce: 0.4760 decode.acc_seg: 88.9035 aux.loss_ce: 0.2384 aux.acc_seg: 88.1241 2024/10/25 13:46:31 - mmengine - INFO - Iter(train) [30150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:07:11 time: 1.9516 data_time: 0.0186 memory: 6647 grad_norm: 4.6026 loss: 0.7125 decode.loss_ce: 0.4785 decode.acc_seg: 80.7309 aux.loss_ce: 0.2339 aux.acc_seg: 79.2950 2024/10/25 13:48:09 - mmengine - INFO - Iter(train) [30200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:05:32 time: 1.9440 data_time: 0.0167 memory: 6646 grad_norm: 5.6331 loss: 0.7940 decode.loss_ce: 0.5194 decode.acc_seg: 72.6290 aux.loss_ce: 0.2747 aux.acc_seg: 66.9187 2024/10/25 13:49:46 - mmengine - INFO - Iter(train) [30250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:03:54 time: 1.9426 data_time: 0.0167 memory: 6648 grad_norm: 8.1265 loss: 0.7429 decode.loss_ce: 0.4893 decode.acc_seg: 81.8852 aux.loss_ce: 0.2536 aux.acc_seg: 84.5036 2024/10/25 13:51:23 - mmengine - INFO - Iter(train) [30300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:02:15 time: 1.9517 data_time: 0.0166 memory: 6646 grad_norm: 7.3555 loss: 0.5765 decode.loss_ce: 0.3828 decode.acc_seg: 86.3359 aux.loss_ce: 0.1937 aux.acc_seg: 84.0057 2024/10/25 13:53:01 - mmengine - INFO - Iter(train) [30350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 3:00:37 time: 1.9455 data_time: 0.0176 memory: 6645 grad_norm: 6.7236 loss: 0.6826 decode.loss_ce: 0.4671 decode.acc_seg: 87.9639 aux.loss_ce: 0.2156 aux.acc_seg: 87.6027 2024/10/25 13:54:42 - mmengine - INFO - Iter(train) 
[30400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:59:04 time: 1.9386 data_time: 0.0171 memory: 6647 grad_norm: 6.4574 loss: 0.6839 decode.loss_ce: 0.4390 decode.acc_seg: 78.7814 aux.loss_ce: 0.2449 aux.acc_seg: 61.7961 2024/10/25 13:56:20 - mmengine - INFO - Iter(train) [30450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:57:26 time: 1.9743 data_time: 0.0160 memory: 6645 grad_norm: 5.7795 loss: 0.7936 decode.loss_ce: 0.5349 decode.acc_seg: 74.8422 aux.loss_ce: 0.2586 aux.acc_seg: 73.8076 2024/10/25 13:57:58 - mmengine - INFO - Iter(train) [30500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:55:47 time: 1.9502 data_time: 0.0169 memory: 6645 grad_norm: 4.5906 loss: 0.6987 decode.loss_ce: 0.4321 decode.acc_seg: 77.8133 aux.loss_ce: 0.2666 aux.acc_seg: 67.1415 2024/10/25 13:59:35 - mmengine - INFO - Iter(train) [30550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:54:09 time: 1.9491 data_time: 0.0164 memory: 6645 grad_norm: 7.6802 loss: 0.7690 decode.loss_ce: 0.4965 decode.acc_seg: 69.0648 aux.loss_ce: 0.2725 aux.acc_seg: 65.3685 2024/10/25 14:01:13 - mmengine - INFO - Iter(train) [30600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:52:31 time: 1.9429 data_time: 0.0164 memory: 6645 grad_norm: 6.8468 loss: 0.7992 decode.loss_ce: 0.5409 decode.acc_seg: 78.3267 aux.loss_ce: 0.2583 aux.acc_seg: 75.9808 2024/10/25 14:02:50 - mmengine - INFO - Iter(train) [30650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:50:51 time: 1.9452 data_time: 0.0156 memory: 6644 grad_norm: 7.2054 loss: 0.6629 decode.loss_ce: 0.4343 decode.acc_seg: 82.0696 aux.loss_ce: 0.2285 aux.acc_seg: 80.5999 2024/10/25 14:04:28 - mmengine - INFO - Iter(train) [30700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:49:13 time: 1.9479 data_time: 0.0176 memory: 6645 grad_norm: 5.7548 loss: 0.7087 decode.loss_ce: 0.4543 decode.acc_seg: 71.3503 aux.loss_ce: 0.2544 aux.acc_seg: 67.3249 2024/10/25 14:06:05 - mmengine - INFO - Iter(train) 
[30750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:47:34 time: 1.9543 data_time: 0.0174 memory: 6647 grad_norm: 7.2028 loss: 0.6633 decode.loss_ce: 0.4499 decode.acc_seg: 86.3848 aux.loss_ce: 0.2134 aux.acc_seg: 86.2125 2024/10/25 14:07:43 - mmengine - INFO - Iter(train) [30800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:45:56 time: 1.9627 data_time: 0.0170 memory: 6645 grad_norm: 5.8636 loss: 0.7539 decode.loss_ce: 0.5111 decode.acc_seg: 72.7156 aux.loss_ce: 0.2428 aux.acc_seg: 72.5195 2024/10/25 14:09:21 - mmengine - INFO - Iter(train) [30850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:44:18 time: 1.9575 data_time: 0.0179 memory: 6646 grad_norm: 5.6946 loss: 0.7146 decode.loss_ce: 0.4744 decode.acc_seg: 86.1247 aux.loss_ce: 0.2402 aux.acc_seg: 84.8832 2024/10/25 14:10:59 - mmengine - INFO - Iter(train) [30900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:42:40 time: 1.9484 data_time: 0.0167 memory: 6646 grad_norm: 6.1002 loss: 0.6044 decode.loss_ce: 0.4099 decode.acc_seg: 87.7928 aux.loss_ce: 0.1945 aux.acc_seg: 87.2472 2024/10/25 14:12:36 - mmengine - INFO - Iter(train) [30950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:41:02 time: 1.9502 data_time: 0.0171 memory: 6645 grad_norm: 5.1196 loss: 0.5829 decode.loss_ce: 0.3824 decode.acc_seg: 86.3721 aux.loss_ce: 0.2005 aux.acc_seg: 85.0497 2024/10/25 14:14:14 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 14:14:14 - mmengine - INFO - Iter(train) [31000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:39:24 time: 1.9401 data_time: 0.0164 memory: 6645 grad_norm: 6.4866 loss: 0.6669 decode.loss_ce: 0.4373 decode.acc_seg: 86.6493 aux.loss_ce: 0.2297 aux.acc_seg: 77.5326 2024/10/25 14:15:52 - mmengine - INFO - Iter(train) [31050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:37:45 time: 1.9487 data_time: 0.0178 memory: 6647 grad_norm: 7.7006 loss: 0.7080 decode.loss_ce: 0.4739 decode.acc_seg: 
84.4602 aux.loss_ce: 0.2341 aux.acc_seg: 84.1628 2024/10/25 14:17:29 - mmengine - INFO - Iter(train) [31100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:36:07 time: 1.9571 data_time: 0.0174 memory: 6645 grad_norm: 5.8067 loss: 0.7349 decode.loss_ce: 0.4795 decode.acc_seg: 82.0495 aux.loss_ce: 0.2554 aux.acc_seg: 80.3129 2024/10/25 14:19:07 - mmengine - INFO - Iter(train) [31150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:34:29 time: 1.9488 data_time: 0.0171 memory: 6645 grad_norm: 5.0865 loss: 0.7016 decode.loss_ce: 0.4737 decode.acc_seg: 78.6875 aux.loss_ce: 0.2279 aux.acc_seg: 76.3793 2024/10/25 14:20:45 - mmengine - INFO - Iter(train) [31200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:32:51 time: 1.9445 data_time: 0.0177 memory: 6645 grad_norm: 4.7716 loss: 0.6289 decode.loss_ce: 0.4043 decode.acc_seg: 82.3064 aux.loss_ce: 0.2246 aux.acc_seg: 76.5792 2024/10/25 14:22:23 - mmengine - INFO - Iter(train) [31250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:31:12 time: 1.9474 data_time: 0.0175 memory: 6646 grad_norm: 4.7186 loss: 0.8226 decode.loss_ce: 0.5432 decode.acc_seg: 77.6389 aux.loss_ce: 0.2794 aux.acc_seg: 74.0567 2024/10/25 14:24:00 - mmengine - INFO - Iter(train) [31300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:29:34 time: 1.9626 data_time: 0.0170 memory: 6645 grad_norm: 7.0446 loss: 0.7057 decode.loss_ce: 0.4681 decode.acc_seg: 85.9752 aux.loss_ce: 0.2376 aux.acc_seg: 78.7822 2024/10/25 14:25:38 - mmengine - INFO - Iter(train) [31350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:27:56 time: 1.9580 data_time: 0.0168 memory: 6646 grad_norm: 4.9552 loss: 0.6803 decode.loss_ce: 0.4564 decode.acc_seg: 78.6904 aux.loss_ce: 0.2239 aux.acc_seg: 74.2496 2024/10/25 14:27:15 - mmengine - INFO - Iter(train) [31400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:26:17 time: 1.9488 data_time: 0.0156 memory: 6646 grad_norm: 4.6304 loss: 0.7369 decode.loss_ce: 0.4856 decode.acc_seg: 84.1007 
aux.loss_ce: 0.2513 aux.acc_seg: 77.4750 2024/10/25 14:28:53 - mmengine - INFO - Iter(train) [31450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:24:38 time: 1.9506 data_time: 0.0162 memory: 6646 grad_norm: 5.4350 loss: 0.7191 decode.loss_ce: 0.4888 decode.acc_seg: 73.9459 aux.loss_ce: 0.2303 aux.acc_seg: 74.2361 2024/10/25 14:30:31 - mmengine - INFO - Iter(train) [31500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:23:01 time: 1.9494 data_time: 0.0166 memory: 6646 grad_norm: 4.2243 loss: 0.6479 decode.loss_ce: 0.4286 decode.acc_seg: 86.0187 aux.loss_ce: 0.2193 aux.acc_seg: 77.0779 2024/10/25 14:32:09 - mmengine - INFO - Iter(train) [31550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:21:23 time: 1.9555 data_time: 0.0150 memory: 6646 grad_norm: 5.1825 loss: 0.8022 decode.loss_ce: 0.5481 decode.acc_seg: 78.2126 aux.loss_ce: 0.2541 aux.acc_seg: 75.4091 2024/10/25 14:33:46 - mmengine - INFO - Iter(train) [31600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:19:44 time: 1.9466 data_time: 0.0166 memory: 6645 grad_norm: 6.7451 loss: 0.6600 decode.loss_ce: 0.4174 decode.acc_seg: 79.8623 aux.loss_ce: 0.2426 aux.acc_seg: 72.6582 2024/10/25 14:35:24 - mmengine - INFO - Iter(train) [31650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:18:05 time: 1.9495 data_time: 0.0163 memory: 6646 grad_norm: 6.4335 loss: 0.6648 decode.loss_ce: 0.4327 decode.acc_seg: 82.4564 aux.loss_ce: 0.2322 aux.acc_seg: 71.3400 2024/10/25 14:37:01 - mmengine - INFO - Iter(train) [31700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:16:27 time: 1.9476 data_time: 0.0168 memory: 6646 grad_norm: 4.9155 loss: 0.6966 decode.loss_ce: 0.4680 decode.acc_seg: 80.7343 aux.loss_ce: 0.2286 aux.acc_seg: 80.0410 2024/10/25 14:38:43 - mmengine - INFO - Iter(train) [31750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:14:54 time: 1.9492 data_time: 0.0160 memory: 6645 grad_norm: 5.9033 loss: 0.7070 decode.loss_ce: 0.4711 decode.acc_seg: 82.0182 
aux.loss_ce: 0.2360 aux.acc_seg: 78.7653 2024/10/25 14:40:21 - mmengine - INFO - Iter(train) [31800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:13:16 time: 1.9636 data_time: 0.0163 memory: 6646 grad_norm: 4.8437 loss: 0.6635 decode.loss_ce: 0.4361 decode.acc_seg: 86.8910 aux.loss_ce: 0.2274 aux.acc_seg: 86.7546 2024/10/25 14:41:58 - mmengine - INFO - Iter(train) [31850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:11:38 time: 1.9581 data_time: 0.0165 memory: 6644 grad_norm: 6.3132 loss: 0.6499 decode.loss_ce: 0.4235 decode.acc_seg: 86.0716 aux.loss_ce: 0.2264 aux.acc_seg: 82.6229 2024/10/25 14:43:36 - mmengine - INFO - Iter(train) [31900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:10:00 time: 1.9675 data_time: 0.0173 memory: 6645 grad_norm: 7.8340 loss: 0.7058 decode.loss_ce: 0.4635 decode.acc_seg: 84.8038 aux.loss_ce: 0.2423 aux.acc_seg: 82.7848 2024/10/25 14:45:13 - mmengine - INFO - Iter(train) [31950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:08:21 time: 1.9491 data_time: 0.0173 memory: 6646 grad_norm: 6.0301 loss: 0.6844 decode.loss_ce: 0.4485 decode.acc_seg: 81.3703 aux.loss_ce: 0.2359 aux.acc_seg: 73.9404 2024/10/25 14:46:51 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 14:46:51 - mmengine - INFO - Iter(train) [32000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:06:43 time: 1.9564 data_time: 0.0175 memory: 6648 grad_norm: 6.6355 loss: 0.7860 decode.loss_ce: 0.5131 decode.acc_seg: 88.2057 aux.loss_ce: 0.2729 aux.acc_seg: 87.9298 2024/10/25 14:46:51 - mmengine - INFO - Saving checkpoint at 32000 iterations 2024/10/25 14:46:56 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0333 data_time: 0.0018 memory: 1049 2024/10/25 14:46:58 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0343 data_time: 0.0017 memory: 1117 2024/10/25 14:46:59 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0351 data_time: 0.0019 memory: 833 
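Records like the ones above can be scraped for plotting loss curves and validation scores. A minimal sketch, assuming plain-Python regex parsing; the regexes and function names here are my own illustration, not part of mmengine:

```python
import re

# Field names (loss, mIoU) match the mmengine log records above;
# the patterns themselves are ad hoc, not an mmengine API.
TRAIN_RE = re.compile(r"Iter\(train\)\s+\[\s*(\d+)/\d+\].*?loss:\s*([\d.]+)")
VAL_RE = re.compile(r"mIoU:\s*([\d.]+)")

def parse_train_loss(text):
    """Return (iteration, total loss) tuples found in the log text."""
    return [(int(i), float(l)) for i, l in TRAIN_RE.findall(text)]

def parse_miou(text):
    """Return every validation mIoU value found in the log text."""
    return [float(m) for m in VAL_RE.findall(text)]

sample = (
    "2024/10/25 14:48:50 - mmengine - INFO - Iter(train) [32050/80000] "
    "base_lr: 1.2000e-04 lr: 1.2000e-04 time: 1.9524 loss: 0.7114\n"
    "2024/10/25 14:47:12 - mmengine - INFO - Iter(val) [500/500] "
    "aAcc: 74.5400 mIoU: 33.4900 mAcc: 44.4300\n"
)
print(parse_train_loss(sample))  # [(32050, 0.7114)]
print(parse_miou(sample))        # [33.49]
```

Feeding the whole log file through these functions gives series suitable for a quick matplotlib plot of training loss against the periodic mIoU checkpoints.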
2024/10/25 14:47:01 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0341 data_time: 0.0017 memory: 866
2024/10/25 14:47:03 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0347 data_time: 0.0019 memory: 906
2024/10/25 14:47:04 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0355 data_time: 0.0023 memory: 2028
2024/10/25 14:47:06 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0333 data_time: 0.0017 memory: 832
2024/10/25 14:47:08 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0353 data_time: 0.0021 memory: 904
2024/10/25 14:47:10 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0320 data_time: 0.0015 memory: 839
2024/10/25 14:47:11 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0320 data_time: 0.0015 memory: 889
2024/10/25 14:47:12 - mmengine - INFO - per class results:
2024/10/25 14:47:12 - mmengine - INFO -
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
| wall                | 65.94 | 85.27 |
| building            | 75.73 | 86.42 |
| sky                 | 88.46 | 94.55 |
| floor               | 69.65 | 84.09 |
| tree                | 64.79 | 83.06 |
| ceiling             | 75.67 | 85.94 |
| road                | 74.01 | 83.14 |
| bed                 | 77.94 | 88.48 |
| windowpane          | 50.74 |  71.1 |
| grass               | 57.33 | 72.31 |
| cabinet             |  50.4 | 61.45 |
| sidewalk            | 51.94 | 66.86 |
| person              | 59.26 | 76.61 |
| earth               | 28.68 | 45.11 |
| door                | 32.46 | 45.74 |
| table               | 42.47 | 57.85 |
| mountain            |  48.5 | 66.25 |
| plant               | 42.59 | 51.76 |
| curtain             | 53.63 |  65.0 |
| chair               | 37.77 |  52.1 |
| car                 | 66.19 | 81.25 |
| water               | 33.95 | 43.54 |
| painting            |  50.3 |  67.1 |
| sofa                | 51.98 | 77.49 |
| shelf               | 24.49 | 34.08 |
| house               | 36.67 | 53.08 |
| sea                 | 40.43 | 70.32 |
| mirror              | 48.61 |  58.0 |
| rug                 | 48.82 | 64.08 |
| field               | 21.84 | 41.66 |
| armchair            | 29.41 | 41.46 |
| seat                | 58.67 | 78.05 |
| fence               | 34.84 | 50.53 |
| desk                | 31.58 | 51.08 |
| rock                | 19.99 | 29.96 |
| wardrobe            | 32.24 |  43.1 |
| lamp                | 27.39 | 34.22 |
| bathtub             | 66.34 | 77.11 |
| railing             | 23.23 | 32.18 |
| cushion             | 33.16 | 45.81 |
| base                | 11.07 | 16.05 |
| box                 | 16.13 | 35.06 |
| column              | 30.23 | 39.18 |
| signboard           | 18.25 | 24.79 |
| chest of drawers    | 28.73 | 53.83 |
| counter             | 30.63 | 36.67 |
| sand                |  31.4 |  48.7 |
| sink                | 50.74 | 61.82 |
| skyscraper          |  48.2 | 73.71 |
| fireplace           | 61.21 | 73.23 |
| refrigerator        | 62.16 | 70.97 |
| grandstand          | 31.39 |  46.0 |
| path                | 16.54 | 30.45 |
| stairs              | 22.39 | 26.18 |
| runway              | 64.79 | 83.21 |
| case                | 39.51 | 54.27 |
| pool table          | 72.06 | 77.21 |
| pillow              | 38.18 | 50.24 |
| screen door         | 51.73 | 56.49 |
| stairway            | 24.25 | 35.55 |
| river               |  4.62 |  7.48 |
| bridge              | 55.57 | 63.78 |
| bookcase            | 20.24 | 27.59 |
| blind               | 17.25 | 17.98 |
| coffee table        |  44.4 | 69.62 |
| toilet              |  60.9 |  72.5 |
| flower              | 23.24 | 31.51 |
| book                | 36.03 | 55.15 |
| hill                |  4.08 |  6.88 |
| bench               | 30.99 |  44.8 |
| countertop          | 36.81 | 44.39 |
| stove               | 48.74 | 68.24 |
| palm                | 31.54 | 46.26 |
| kitchen island      | 20.18 | 30.91 |
| computer            | 44.12 | 58.23 |
| swivel chair        |  35.5 | 47.59 |
| boat                | 60.17 |  73.9 |
| bar                 | 26.56 | 30.25 |
| arcade machine      | 22.78 |  26.3 |
| hovel               | 24.19 | 28.26 |
| bus                 | 71.84 | 80.11 |
| towel               | 34.49 | 52.01 |
| light               |  8.42 |  9.12 |
| truck               | 10.95 | 23.22 |
| tower               | 30.88 | 65.26 |
| chandelier          | 46.03 | 66.68 |
| awning              | 17.98 | 22.27 |
| streetlight         |  3.6  |  4.23 |
| booth               | 34.73 | 36.31 |
| television receiver | 50.99 | 62.55 |
| airplane            | 32.99 | 45.21 |
| dirt track          |  2.75 | 28.14 |
| apparel             | 16.61 | 28.58 |
| pole                |  8.56 | 10.35 |
| land                |  0.0  |  0.0  |
| bannister           |  1.59 |  1.77 |
| escalator           | 44.78 | 62.41 |
| ottoman             | 32.88 | 50.94 |
| bottle              |  9.61 | 12.53 |
| buffet              | 39.47 | 42.85 |
| poster              | 20.16 | 23.61 |
| stage               |  8.58 | 12.42 |
| van                 | 21.24 | 25.13 |
| ship                | 43.25 | 45.52 |
| fountain            |  2.35 |  2.36 |
| conveyer belt       | 44.09 | 80.19 |
| canopy              |  9.73 | 11.89 |
| washer              | 54.99 |  66.2 |
| plaything           | 10.29 |  21.1 |
| swimming pool       | 41.62 | 67.59 |
| stool               | 16.43 | 20.77 |
| barrel              | 20.28 | 60.31 |
| basket              | 10.76 | 13.04 |
| waterfall           | 43.61 | 58.63 |
| tent                | 81.93 | 90.61 |
| bag                 |  3.84 |  4.24 |
| minibike            | 48.32 | 62.44 |
| cradle              | 55.44 | 73.94 |
| oven                | 21.06 | 48.53 |
| ball                |  2.46 |  3.48 |
| food                | 36.62 | 47.98 |
| step                |  5.23 |  6.73 |
| tank                | 30.73 | 44.98 |
| trade name          |  7.41 |  8.11 |
| microwave           |  32.6 | 39.42 |
| pot                 | 24.58 | 28.59 |
| animal              | 37.17 | 42.86 |
| bicycle             |  31.2 |  49.3 |
| lake                |  3.47 |  7.34 |
| dishwasher          | 43.69 | 52.85 |
| screen              | 58.89 | 79.26 |
| blanket             |  9.93 | 12.06 |
| sculpture           | 40.95 | 44.05 |
| hood                | 34.33 | 37.57 |
| sconce              | 16.43 | 20.46 |
| vase                | 13.72 | 19.15 |
| traffic light       | 10.16 | 16.87 |
| tray                |  0.89 |  2.1  |
| ashcan              | 16.55 | 26.89 |
| fan                 | 28.19 | 40.79 |
| pier                | 52.69 | 72.05 |
| crt screen          |  0.0  |  0.0  |
| plate               | 20.62 | 27.98 |
| monitor             |  1.19 |  1.24 |
| bulletin board      | 28.41 | 36.64 |
| shower              |  0.0  |  0.0  |
| radiator            | 33.98 | 39.92 |
| glass               |  0.59 |  0.61 |
| clock               |  1.28 |  1.34 |
| flag                | 16.26 | 18.49 |
+---------------------+-------+-------+
2024/10/25 14:47:12 - mmengine - INFO - Iter(val) [500/500] aAcc: 74.5400 mIoU: 33.4900 mAcc: 44.4300 data_time: 0.0018 time: 0.0340
2024/10/25 14:48:50 - mmengine - INFO - Iter(train) [32050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:05:05 time: 1.9524 data_time: 0.0159 memory: 6645 grad_norm: 5.4768 loss: 0.7114 decode.loss_ce: 0.4818 decode.acc_seg: 82.1138 aux.loss_ce: 0.2296 aux.acc_seg: 79.9907
2024/10/25 14:50:27 - mmengine - INFO - Iter(train) [32100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:03:27 time: 1.9489 data_time: 0.0170 memory: 6646 grad_norm: 5.7119 loss: 0.7601 decode.loss_ce: 0.5053 decode.acc_seg: 88.2059 aux.loss_ce: 0.2548 aux.acc_seg: 84.0189
2024/10/25 14:52:05 - mmengine - INFO - Iter(train) [32150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:01:49 time: 1.9765 data_time: 0.0155 memory: 6646 grad_norm: 5.3119 loss: 0.6788
decode.loss_ce: 0.4477 decode.acc_seg: 82.2194 aux.loss_ce: 0.2312 aux.acc_seg: 74.8356 2024/10/25 14:53:43 - mmengine - INFO - Iter(train) [32200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 2:00:11 time: 1.9442 data_time: 0.0160 memory: 6645 grad_norm: 5.2515 loss: 0.6889 decode.loss_ce: 0.4568 decode.acc_seg: 78.6168 aux.loss_ce: 0.2320 aux.acc_seg: 73.2262 2024/10/25 14:55:21 - mmengine - INFO - Iter(train) [32250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:58:33 time: 1.9565 data_time: 0.0161 memory: 6646 grad_norm: 5.9566 loss: 0.7175 decode.loss_ce: 0.4685 decode.acc_seg: 86.2285 aux.loss_ce: 0.2491 aux.acc_seg: 81.3696 2024/10/25 14:56:58 - mmengine - INFO - Iter(train) [32300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:56:54 time: 1.9450 data_time: 0.0178 memory: 6644 grad_norm: 5.8437 loss: 0.6192 decode.loss_ce: 0.4173 decode.acc_seg: 82.7647 aux.loss_ce: 0.2019 aux.acc_seg: 82.7958 2024/10/25 14:58:36 - mmengine - INFO - Iter(train) [32350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:55:16 time: 1.9600 data_time: 0.0167 memory: 6647 grad_norm: 5.3494 loss: 0.7119 decode.loss_ce: 0.4692 decode.acc_seg: 85.3236 aux.loss_ce: 0.2426 aux.acc_seg: 79.9235 2024/10/25 15:00:13 - mmengine - INFO - Iter(train) [32400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:53:37 time: 1.9494 data_time: 0.0161 memory: 6646 grad_norm: 4.5705 loss: 0.6249 decode.loss_ce: 0.4198 decode.acc_seg: 75.9567 aux.loss_ce: 0.2051 aux.acc_seg: 71.4712 2024/10/25 15:01:51 - mmengine - INFO - Iter(train) [32450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:51:59 time: 1.9556 data_time: 0.0171 memory: 6646 grad_norm: 7.3003 loss: 0.6842 decode.loss_ce: 0.4480 decode.acc_seg: 79.5390 aux.loss_ce: 0.2362 aux.acc_seg: 73.3601 2024/10/25 15:03:29 - mmengine - INFO - Iter(train) [32500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:50:20 time: 1.9390 data_time: 0.0159 memory: 6645 grad_norm: 6.6109 loss: 0.6399 
decode.loss_ce: 0.4221 decode.acc_seg: 89.8374 aux.loss_ce: 0.2178 aux.acc_seg: 85.1049 2024/10/25 15:05:06 - mmengine - INFO - Iter(train) [32550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:48:42 time: 1.9475 data_time: 0.0165 memory: 6645 grad_norm: 4.2623 loss: 0.6176 decode.loss_ce: 0.4152 decode.acc_seg: 87.4644 aux.loss_ce: 0.2023 aux.acc_seg: 84.6545 2024/10/25 15:06:44 - mmengine - INFO - Iter(train) [32600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:47:04 time: 1.9550 data_time: 0.0159 memory: 6645 grad_norm: 5.9905 loss: 0.6951 decode.loss_ce: 0.4600 decode.acc_seg: 88.6610 aux.loss_ce: 0.2350 aux.acc_seg: 82.4502 2024/10/25 15:08:22 - mmengine - INFO - Iter(train) [32650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:45:26 time: 1.9504 data_time: 0.0158 memory: 6645 grad_norm: 5.3956 loss: 0.6293 decode.loss_ce: 0.4217 decode.acc_seg: 84.8072 aux.loss_ce: 0.2076 aux.acc_seg: 83.4759 2024/10/25 15:09:59 - mmengine - INFO - Iter(train) [32700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:43:47 time: 1.9472 data_time: 0.0170 memory: 6648 grad_norm: 4.8277 loss: 0.6477 decode.loss_ce: 0.4291 decode.acc_seg: 86.0698 aux.loss_ce: 0.2186 aux.acc_seg: 87.9112 2024/10/25 15:11:37 - mmengine - INFO - Iter(train) [32750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:42:09 time: 1.9692 data_time: 0.0163 memory: 6645 grad_norm: 6.3947 loss: 0.6644 decode.loss_ce: 0.4586 decode.acc_seg: 85.2907 aux.loss_ce: 0.2058 aux.acc_seg: 87.8243 2024/10/25 15:13:14 - mmengine - INFO - Iter(train) [32800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:40:30 time: 1.9479 data_time: 0.0169 memory: 6645 grad_norm: 6.0347 loss: 0.6655 decode.loss_ce: 0.4494 decode.acc_seg: 85.1736 aux.loss_ce: 0.2161 aux.acc_seg: 87.2985 2024/10/25 15:14:52 - mmengine - INFO - Iter(train) [32850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:38:52 time: 1.9636 data_time: 0.0161 memory: 6645 grad_norm: 5.3838 loss: 0.6656 
decode.loss_ce: 0.4302 decode.acc_seg: 82.3518 aux.loss_ce: 0.2355 aux.acc_seg: 77.4911 2024/10/25 15:16:30 - mmengine - INFO - Iter(train) [32900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:37:14 time: 1.9450 data_time: 0.0161 memory: 6645 grad_norm: 6.2655 loss: 0.7948 decode.loss_ce: 0.5368 decode.acc_seg: 82.0825 aux.loss_ce: 0.2580 aux.acc_seg: 80.0450 2024/10/25 15:18:07 - mmengine - INFO - Iter(train) [32950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:35:35 time: 1.9557 data_time: 0.0167 memory: 6646 grad_norm: 5.7634 loss: 0.7440 decode.loss_ce: 0.4855 decode.acc_seg: 85.2598 aux.loss_ce: 0.2585 aux.acc_seg: 83.3608 2024/10/25 15:19:45 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 15:19:45 - mmengine - INFO - Iter(train) [33000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:33:57 time: 1.9678 data_time: 0.0167 memory: 6645 grad_norm: 5.3256 loss: 0.8599 decode.loss_ce: 0.5701 decode.acc_seg: 76.1687 aux.loss_ce: 0.2899 aux.acc_seg: 71.1388 2024/10/25 15:21:22 - mmengine - INFO - Iter(train) [33050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:32:18 time: 1.9402 data_time: 0.0174 memory: 6646 grad_norm: 4.4359 loss: 0.6989 decode.loss_ce: 0.4619 decode.acc_seg: 87.6845 aux.loss_ce: 0.2370 aux.acc_seg: 80.7493 2024/10/25 15:22:59 - mmengine - INFO - Iter(train) [33100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:30:39 time: 1.9531 data_time: 0.0164 memory: 6645 grad_norm: 7.8427 loss: 0.7694 decode.loss_ce: 0.5196 decode.acc_seg: 77.4910 aux.loss_ce: 0.2498 aux.acc_seg: 74.7009 2024/10/25 15:24:37 - mmengine - INFO - Iter(train) [33150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:29:01 time: 1.9482 data_time: 0.0179 memory: 6646 grad_norm: 9.3650 loss: 0.5541 decode.loss_ce: 0.3560 decode.acc_seg: 90.7161 aux.loss_ce: 0.1981 aux.acc_seg: 79.4345 2024/10/25 15:26:14 - mmengine - INFO - Iter(train) [33200/80000] base_lr: 1.2000e-04 lr: 
1.2000e-04 eta: 1 day, 1:27:22 time: 1.9504 data_time: 0.0171 memory: 6645 grad_norm: 7.0552 loss: 0.7652 decode.loss_ce: 0.5115 decode.acc_seg: 72.6617 aux.loss_ce: 0.2537 aux.acc_seg: 66.0576 2024/10/25 15:27:52 - mmengine - INFO - Iter(train) [33250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:25:44 time: 1.9510 data_time: 0.0174 memory: 6645 grad_norm: 6.0542 loss: 0.7110 decode.loss_ce: 0.4741 decode.acc_seg: 77.8551 aux.loss_ce: 0.2369 aux.acc_seg: 70.8303 2024/10/25 15:29:29 - mmengine - INFO - Iter(train) [33300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:24:05 time: 1.9465 data_time: 0.0178 memory: 6645 grad_norm: 5.0211 loss: 0.6634 decode.loss_ce: 0.4406 decode.acc_seg: 77.7659 aux.loss_ce: 0.2228 aux.acc_seg: 74.2454 2024/10/25 15:31:07 - mmengine - INFO - Iter(train) [33350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:22:27 time: 1.9513 data_time: 0.0171 memory: 6645 grad_norm: 4.5781 loss: 0.8021 decode.loss_ce: 0.5467 decode.acc_seg: 84.9334 aux.loss_ce: 0.2554 aux.acc_seg: 85.4713 2024/10/25 15:32:44 - mmengine - INFO - Iter(train) [33400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:20:48 time: 1.9479 data_time: 0.0172 memory: 6645 grad_norm: 5.2709 loss: 0.7390 decode.loss_ce: 0.4793 decode.acc_seg: 85.2714 aux.loss_ce: 0.2596 aux.acc_seg: 80.5618 2024/10/25 15:34:22 - mmengine - INFO - Iter(train) [33450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:19:10 time: 1.9502 data_time: 0.0150 memory: 6647 grad_norm: 7.9562 loss: 0.7596 decode.loss_ce: 0.5142 decode.acc_seg: 72.3679 aux.loss_ce: 0.2454 aux.acc_seg: 67.8654 2024/10/25 15:36:00 - mmengine - INFO - Iter(train) [33500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:17:32 time: 1.9439 data_time: 0.0162 memory: 6645 grad_norm: 5.2911 loss: 0.8101 decode.loss_ce: 0.5394 decode.acc_seg: 87.2482 aux.loss_ce: 0.2707 aux.acc_seg: 83.3445 2024/10/25 15:37:38 - mmengine - INFO - Iter(train) [33550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 
eta: 1 day, 1:15:55 time: 1.9489 data_time: 0.0176 memory: 6645 grad_norm: 6.1539 loss: 0.6934 decode.loss_ce: 0.4692 decode.acc_seg: 87.9072 aux.loss_ce: 0.2241 aux.acc_seg: 85.2673 2024/10/25 15:39:15 - mmengine - INFO - Iter(train) [33600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:14:16 time: 1.9440 data_time: 0.0162 memory: 6645 grad_norm: 4.8748 loss: 0.7737 decode.loss_ce: 0.5359 decode.acc_seg: 82.5867 aux.loss_ce: 0.2378 aux.acc_seg: 82.2107 2024/10/25 15:40:53 - mmengine - INFO - Iter(train) [33650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:12:38 time: 1.9487 data_time: 0.0177 memory: 6645 grad_norm: 5.4847 loss: 0.7764 decode.loss_ce: 0.5221 decode.acc_seg: 81.0717 aux.loss_ce: 0.2544 aux.acc_seg: 78.4186 2024/10/25 15:42:31 - mmengine - INFO - Iter(train) [33700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:11:00 time: 1.9792 data_time: 0.0153 memory: 6645 grad_norm: 4.3169 loss: 0.6170 decode.loss_ce: 0.4016 decode.acc_seg: 86.5727 aux.loss_ce: 0.2155 aux.acc_seg: 76.6784 2024/10/25 15:44:08 - mmengine - INFO - Iter(train) [33750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:09:21 time: 1.9496 data_time: 0.0171 memory: 6646 grad_norm: 4.8755 loss: 0.6521 decode.loss_ce: 0.4384 decode.acc_seg: 76.6007 aux.loss_ce: 0.2138 aux.acc_seg: 76.0434 2024/10/25 15:45:46 - mmengine - INFO - Iter(train) [33800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:07:43 time: 1.9552 data_time: 0.0163 memory: 6646 grad_norm: 5.5772 loss: 0.6264 decode.loss_ce: 0.4209 decode.acc_seg: 78.2755 aux.loss_ce: 0.2055 aux.acc_seg: 76.5900 2024/10/25 15:47:24 - mmengine - INFO - Iter(train) [33850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:06:05 time: 1.9538 data_time: 0.0168 memory: 6646 grad_norm: 5.7963 loss: 0.6005 decode.loss_ce: 0.4035 decode.acc_seg: 82.8490 aux.loss_ce: 0.1970 aux.acc_seg: 83.7386 2024/10/25 15:49:02 - mmengine - INFO - Iter(train) [33900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 
1:04:27 time: 1.9670 data_time: 0.0161 memory: 6646 grad_norm: 4.5337 loss: 0.6966 decode.loss_ce: 0.4659 decode.acc_seg: 77.0127 aux.loss_ce: 0.2307 aux.acc_seg: 75.5198 2024/10/25 15:50:42 - mmengine - INFO - Iter(train) [33950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:02:53 time: 1.9487 data_time: 0.0169 memory: 6647 grad_norm: 4.0055 loss: 0.6853 decode.loss_ce: 0.4709 decode.acc_seg: 78.9147 aux.loss_ce: 0.2144 aux.acc_seg: 77.7567 2024/10/25 15:52:20 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 15:52:20 - mmengine - INFO - Iter(train) [34000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 1:01:14 time: 1.9433 data_time: 0.0163 memory: 6645 grad_norm: 6.3800 loss: 0.6390 decode.loss_ce: 0.4293 decode.acc_seg: 84.3741 aux.loss_ce: 0.2097 aux.acc_seg: 82.1224 2024/10/25 15:53:57 - mmengine - INFO - Iter(train) [34050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:59:36 time: 1.9479 data_time: 0.0166 memory: 6647 grad_norm: 5.0337 loss: 0.7140 decode.loss_ce: 0.4727 decode.acc_seg: 81.0256 aux.loss_ce: 0.2414 aux.acc_seg: 76.9053 2024/10/25 15:55:35 - mmengine - INFO - Iter(train) [34100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:57:57 time: 1.9492 data_time: 0.0176 memory: 6647 grad_norm: 6.7124 loss: 0.5992 decode.loss_ce: 0.4022 decode.acc_seg: 75.0582 aux.loss_ce: 0.1970 aux.acc_seg: 70.8352 2024/10/25 15:57:13 - mmengine - INFO - Iter(train) [34150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:56:19 time: 1.9423 data_time: 0.0161 memory: 6646 grad_norm: 4.1994 loss: 0.6082 decode.loss_ce: 0.3953 decode.acc_seg: 81.4715 aux.loss_ce: 0.2129 aux.acc_seg: 81.4187 2024/10/25 15:58:51 - mmengine - INFO - Iter(train) [34200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:54:41 time: 1.9508 data_time: 0.0175 memory: 6646 grad_norm: 4.9187 loss: 0.6132 decode.loss_ce: 0.4095 decode.acc_seg: 87.9271 aux.loss_ce: 0.2037 aux.acc_seg: 84.5448 2024/10/25 
16:00:28 - mmengine - INFO - Iter(train) [34250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:53:03 time: 1.9514 data_time: 0.0174 memory: 6645 grad_norm: 8.3540 loss: 0.6638 decode.loss_ce: 0.4426 decode.acc_seg: 71.7795 aux.loss_ce: 0.2212 aux.acc_seg: 66.1807 2024/10/25 16:02:06 - mmengine - INFO - Iter(train) [34300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:51:25 time: 1.9449 data_time: 0.0175 memory: 6645 grad_norm: 4.7029 loss: 0.6087 decode.loss_ce: 0.4153 decode.acc_seg: 84.5688 aux.loss_ce: 0.1935 aux.acc_seg: 84.9456 2024/10/25 16:03:44 - mmengine - INFO - Iter(train) [34350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:49:47 time: 1.9457 data_time: 0.0175 memory: 6645 grad_norm: 5.5073 loss: 0.7201 decode.loss_ce: 0.4880 decode.acc_seg: 79.5420 aux.loss_ce: 0.2322 aux.acc_seg: 73.3592 2024/10/25 16:05:22 - mmengine - INFO - Iter(train) [34400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:48:09 time: 1.9498 data_time: 0.0175 memory: 6646 grad_norm: 7.6283 loss: 0.6594 decode.loss_ce: 0.4379 decode.acc_seg: 79.8529 aux.loss_ce: 0.2216 aux.acc_seg: 75.1390 2024/10/25 16:07:00 - mmengine - INFO - Iter(train) [34450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:46:31 time: 1.9429 data_time: 0.0174 memory: 6646 grad_norm: 4.2888 loss: 0.6530 decode.loss_ce: 0.4321 decode.acc_seg: 81.4953 aux.loss_ce: 0.2209 aux.acc_seg: 77.9512 2024/10/25 16:08:37 - mmengine - INFO - Iter(train) [34500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:44:53 time: 1.9494 data_time: 0.0168 memory: 6646 grad_norm: 5.2891 loss: 0.7105 decode.loss_ce: 0.4604 decode.acc_seg: 76.2880 aux.loss_ce: 0.2501 aux.acc_seg: 68.5983 2024/10/25 16:10:15 - mmengine - INFO - Iter(train) [34550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:43:14 time: 1.9434 data_time: 0.0179 memory: 6645 grad_norm: 4.5556 loss: 0.5005 decode.loss_ce: 0.3339 decode.acc_seg: 86.0822 aux.loss_ce: 0.1666 aux.acc_seg: 85.5504 2024/10/25 16:11:52 - 
mmengine - INFO - Iter(train) [34600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:41:36 time: 1.9518 data_time: 0.0166 memory: 6645 grad_norm: 6.4584 loss: 0.7372 decode.loss_ce: 0.4949 decode.acc_seg: 84.0045 aux.loss_ce: 0.2423 aux.acc_seg: 81.8889 2024/10/25 16:13:30 - mmengine - INFO - Iter(train) [34650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:39:57 time: 1.9469 data_time: 0.0170 memory: 6645 grad_norm: 6.5181 loss: 0.6329 decode.loss_ce: 0.4260 decode.acc_seg: 84.6614 aux.loss_ce: 0.2069 aux.acc_seg: 84.4646 2024/10/25 16:15:08 - mmengine - INFO - Iter(train) [34700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:38:19 time: 1.9657 data_time: 0.0163 memory: 6645 grad_norm: 5.1778 loss: 0.6608 decode.loss_ce: 0.4451 decode.acc_seg: 74.6952 aux.loss_ce: 0.2156 aux.acc_seg: 73.4205 2024/10/25 16:16:45 - mmengine - INFO - Iter(train) [34750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:36:41 time: 1.9429 data_time: 0.0163 memory: 6645 grad_norm: 5.0329 loss: 0.5772 decode.loss_ce: 0.3809 decode.acc_seg: 86.7929 aux.loss_ce: 0.1963 aux.acc_seg: 83.2002 2024/10/25 16:18:23 - mmengine - INFO - Iter(train) [34800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:35:03 time: 1.9572 data_time: 0.0168 memory: 6645 grad_norm: 4.1235 loss: 0.6761 decode.loss_ce: 0.4636 decode.acc_seg: 79.7372 aux.loss_ce: 0.2125 aux.acc_seg: 78.6454 2024/10/25 16:20:00 - mmengine - INFO - Iter(train) [34850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:33:24 time: 1.9472 data_time: 0.0170 memory: 6646 grad_norm: 4.1229 loss: 0.6312 decode.loss_ce: 0.4204 decode.acc_seg: 73.3262 aux.loss_ce: 0.2108 aux.acc_seg: 69.1006 2024/10/25 16:21:38 - mmengine - INFO - Iter(train) [34900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:31:47 time: 1.9424 data_time: 0.0170 memory: 6646 grad_norm: 7.2778 loss: 0.7091 decode.loss_ce: 0.4818 decode.acc_seg: 81.2437 aux.loss_ce: 0.2272 aux.acc_seg: 77.7424 2024/10/25 16:23:16 - mmengine - 
INFO - Iter(train) [34950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:30:08 time: 1.9480 data_time: 0.0173 memory: 6645 grad_norm: 6.3233 loss: 0.7842 decode.loss_ce: 0.5126 decode.acc_seg: 80.1637 aux.loss_ce: 0.2716 aux.acc_seg: 74.0955 2024/10/25 16:24:53 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 16:24:53 - mmengine - INFO - Iter(train) [35000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:28:30 time: 1.9374 data_time: 0.0164 memory: 6647 grad_norm: 6.2164 loss: 0.7429 decode.loss_ce: 0.5020 decode.acc_seg: 73.7295 aux.loss_ce: 0.2410 aux.acc_seg: 70.9670 2024/10/25 16:26:30 - mmengine - INFO - Iter(train) [35050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:26:51 time: 1.9436 data_time: 0.0180 memory: 6646 grad_norm: 5.0055 loss: 0.6315 decode.loss_ce: 0.4142 decode.acc_seg: 84.5662 aux.loss_ce: 0.2173 aux.acc_seg: 80.4523 2024/10/25 16:28:08 - mmengine - INFO - Iter(train) [35100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:25:12 time: 1.9476 data_time: 0.0162 memory: 6646 grad_norm: 5.2195 loss: 0.6451 decode.loss_ce: 0.4363 decode.acc_seg: 86.0865 aux.loss_ce: 0.2088 aux.acc_seg: 76.3155 2024/10/25 16:29:46 - mmengine - INFO - Iter(train) [35150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:23:34 time: 1.9471 data_time: 0.0184 memory: 6647 grad_norm: 7.2224 loss: 0.7928 decode.loss_ce: 0.5260 decode.acc_seg: 88.4839 aux.loss_ce: 0.2669 aux.acc_seg: 76.1517 2024/10/25 16:31:23 - mmengine - INFO - Iter(train) [35200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:21:56 time: 1.9574 data_time: 0.0161 memory: 6645 grad_norm: 4.1471 loss: 0.6544 decode.loss_ce: 0.4440 decode.acc_seg: 76.8362 aux.loss_ce: 0.2104 aux.acc_seg: 75.2158 2024/10/25 16:33:00 - mmengine - INFO - Iter(train) [35250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:20:17 time: 1.9413 data_time: 0.0161 memory: 6647 grad_norm: 5.1750 loss: 0.6496 decode.loss_ce: 0.4389 
decode.acc_seg: 79.4236 aux.loss_ce: 0.2107 aux.acc_seg: 77.1474 2024/10/25 16:34:38 - mmengine - INFO - Iter(train) [35300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:18:39 time: 1.9461 data_time: 0.0161 memory: 6645 grad_norm: 4.7411 loss: 0.7027 decode.loss_ce: 0.4704 decode.acc_seg: 88.7443 aux.loss_ce: 0.2323 aux.acc_seg: 87.4426 2024/10/25 16:36:15 - mmengine - INFO - Iter(train) [35350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:17:00 time: 1.9444 data_time: 0.0172 memory: 6646 grad_norm: 5.6364 loss: 0.6469 decode.loss_ce: 0.4207 decode.acc_seg: 88.2277 aux.loss_ce: 0.2262 aux.acc_seg: 88.0235 2024/10/25 16:37:53 - mmengine - INFO - Iter(train) [35400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:15:22 time: 1.9424 data_time: 0.0165 memory: 6646 grad_norm: 4.1632 loss: 0.6071 decode.loss_ce: 0.3966 decode.acc_seg: 89.6459 aux.loss_ce: 0.2105 aux.acc_seg: 80.2774 2024/10/25 16:39:30 - mmengine - INFO - Iter(train) [35450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:13:43 time: 1.9466 data_time: 0.0170 memory: 6646 grad_norm: 6.3624 loss: 0.7388 decode.loss_ce: 0.4902 decode.acc_seg: 71.3145 aux.loss_ce: 0.2486 aux.acc_seg: 64.0432 2024/10/25 16:41:08 - mmengine - INFO - Iter(train) [35500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:12:05 time: 1.9405 data_time: 0.0174 memory: 6645 grad_norm: 5.1690 loss: 0.6749 decode.loss_ce: 0.4595 decode.acc_seg: 84.5659 aux.loss_ce: 0.2154 aux.acc_seg: 83.1214 2024/10/25 16:42:45 - mmengine - INFO - Iter(train) [35550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:10:26 time: 1.9608 data_time: 0.0170 memory: 6648 grad_norm: 5.4335 loss: 0.5869 decode.loss_ce: 0.3958 decode.acc_seg: 86.0207 aux.loss_ce: 0.1910 aux.acc_seg: 84.7875 2024/10/25 16:44:23 - mmengine - INFO - Iter(train) [35600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:08:48 time: 1.9542 data_time: 0.0172 memory: 6648 grad_norm: 4.9636 loss: 0.6598 decode.loss_ce: 0.4397 
decode.acc_seg: 85.9204 aux.loss_ce: 0.2200 aux.acc_seg: 69.4906 2024/10/25 16:46:00 - mmengine - INFO - Iter(train) [35650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:07:10 time: 1.9481 data_time: 0.0170 memory: 6645 grad_norm: 4.4374 loss: 0.5657 decode.loss_ce: 0.3774 decode.acc_seg: 90.5362 aux.loss_ce: 0.1882 aux.acc_seg: 86.5086 2024/10/25 16:47:38 - mmengine - INFO - Iter(train) [35700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:05:32 time: 1.9659 data_time: 0.0168 memory: 6646 grad_norm: 6.1179 loss: 0.7427 decode.loss_ce: 0.4899 decode.acc_seg: 85.4908 aux.loss_ce: 0.2528 aux.acc_seg: 84.6974 2024/10/25 16:49:16 - mmengine - INFO - Iter(train) [35750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:03:54 time: 1.9473 data_time: 0.0178 memory: 6646 grad_norm: 5.1822 loss: 0.6808 decode.loss_ce: 0.4451 decode.acc_seg: 79.8573 aux.loss_ce: 0.2357 aux.acc_seg: 69.2611 2024/10/25 16:50:55 - mmengine - INFO - Iter(train) [35800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:02:17 time: 1.9465 data_time: 0.0173 memory: 6646 grad_norm: 6.7389 loss: 0.6990 decode.loss_ce: 0.4651 decode.acc_seg: 87.9820 aux.loss_ce: 0.2340 aux.acc_seg: 82.0919 2024/10/25 16:52:32 - mmengine - INFO - Iter(train) [35850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 1 day, 0:00:38 time: 1.9471 data_time: 0.0171 memory: 6646 grad_norm: 6.5223 loss: 0.6011 decode.loss_ce: 0.3898 decode.acc_seg: 84.8380 aux.loss_ce: 0.2113 aux.acc_seg: 80.1186 2024/10/25 16:54:09 - mmengine - INFO - Iter(train) [35900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:59:00 time: 1.9445 data_time: 0.0159 memory: 6646 grad_norm: 7.0610 loss: 0.6482 decode.loss_ce: 0.4353 decode.acc_seg: 92.4110 aux.loss_ce: 0.2128 aux.acc_seg: 87.1145 2024/10/25 16:55:47 - mmengine - INFO - Iter(train) [35950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:57:22 time: 1.9567 data_time: 0.0171 memory: 6645 grad_norm: 3.2390 loss: 0.6696 decode.loss_ce: 0.4468 decode.acc_seg: 84.6893 
aux.loss_ce: 0.2228 aux.acc_seg: 81.1763 2024/10/25 16:57:25 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 16:57:25 - mmengine - INFO - Iter(train) [36000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:55:43 time: 1.9465 data_time: 0.0164 memory: 6646 grad_norm: 5.7800 loss: 0.6498 decode.loss_ce: 0.4453 decode.acc_seg: 77.2401 aux.loss_ce: 0.2045 aux.acc_seg: 73.2753 2024/10/25 16:59:02 - mmengine - INFO - Iter(train) [36050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:54:05 time: 1.9474 data_time: 0.0160 memory: 6645 grad_norm: 7.0147 loss: 0.6640 decode.loss_ce: 0.4481 decode.acc_seg: 88.3514 aux.loss_ce: 0.2159 aux.acc_seg: 85.9314 2024/10/25 17:00:43 - mmengine - INFO - Iter(train) [36100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:52:31 time: 1.9697 data_time: 0.0144 memory: 6645 grad_norm: 5.9207 loss: 0.5946 decode.loss_ce: 0.3978 decode.acc_seg: 84.9099 aux.loss_ce: 0.1968 aux.acc_seg: 80.1357 2024/10/25 17:02:20 - mmengine - INFO - Iter(train) [36150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:50:52 time: 1.9464 data_time: 0.0163 memory: 6647 grad_norm: 4.4817 loss: 0.6761 decode.loss_ce: 0.4546 decode.acc_seg: 90.2731 aux.loss_ce: 0.2215 aux.acc_seg: 83.3516 2024/10/25 17:03:58 - mmengine - INFO - Iter(train) [36200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:49:14 time: 1.9475 data_time: 0.0165 memory: 6645 grad_norm: 6.1724 loss: 0.5483 decode.loss_ce: 0.3728 decode.acc_seg: 84.9558 aux.loss_ce: 0.1754 aux.acc_seg: 83.1105 2024/10/25 17:05:36 - mmengine - INFO - Iter(train) [36250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:47:36 time: 1.9476 data_time: 0.0161 memory: 6646 grad_norm: 6.3899 loss: 0.7357 decode.loss_ce: 0.4864 decode.acc_seg: 73.6231 aux.loss_ce: 0.2493 aux.acc_seg: 70.3074 2024/10/25 17:07:13 - mmengine - INFO - Iter(train) [36300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:45:57 time: 1.9452 data_time: 0.0172 memory: 6647 grad_norm: 6.3953 
loss: 0.6995 decode.loss_ce: 0.4653 decode.acc_seg: 86.4945 aux.loss_ce: 0.2342 aux.acc_seg: 86.8858 2024/10/25 17:08:51 - mmengine - INFO - Iter(train) [36350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:44:19 time: 1.9448 data_time: 0.0183 memory: 6645 grad_norm: 6.7512 loss: 0.9278 decode.loss_ce: 0.6245 decode.acc_seg: 74.5193 aux.loss_ce: 0.3033 aux.acc_seg: 68.7355 2024/10/25 17:10:28 - mmengine - INFO - Iter(train) [36400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:42:41 time: 1.9521 data_time: 0.0174 memory: 6645 grad_norm: 5.3496 loss: 0.6975 decode.loss_ce: 0.4672 decode.acc_seg: 81.9405 aux.loss_ce: 0.2303 aux.acc_seg: 74.8849 2024/10/25 17:12:06 - mmengine - INFO - Iter(train) [36450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:41:02 time: 1.9517 data_time: 0.0171 memory: 6646 grad_norm: 6.1171 loss: 0.7318 decode.loss_ce: 0.4917 decode.acc_seg: 82.6086 aux.loss_ce: 0.2401 aux.acc_seg: 75.0797 2024/10/25 17:13:43 - mmengine - INFO - Iter(train) [36500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:39:24 time: 1.9450 data_time: 0.0179 memory: 6646 grad_norm: 6.6598 loss: 0.6569 decode.loss_ce: 0.4473 decode.acc_seg: 83.8726 aux.loss_ce: 0.2096 aux.acc_seg: 79.6300 2024/10/25 17:15:21 - mmengine - INFO - Iter(train) [36550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:37:46 time: 1.9465 data_time: 0.0166 memory: 6645 grad_norm: 4.2238 loss: 0.6481 decode.loss_ce: 0.4285 decode.acc_seg: 84.3872 aux.loss_ce: 0.2196 aux.acc_seg: 83.4542 2024/10/25 17:16:58 - mmengine - INFO - Iter(train) [36600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:36:08 time: 1.9537 data_time: 0.0159 memory: 6645 grad_norm: 4.3006 loss: 0.6651 decode.loss_ce: 0.4326 decode.acc_seg: 84.0932 aux.loss_ce: 0.2325 aux.acc_seg: 77.9163 2024/10/25 17:18:36 - mmengine - INFO - Iter(train) [36650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:34:29 time: 1.9601 data_time: 0.0164 memory: 6647 grad_norm: 4.7206 loss: 0.4941 decode.loss_ce: 0.3331 decode.acc_seg: 
87.5687 aux.loss_ce: 0.1610 aux.acc_seg: 85.1000 2024/10/25 17:20:13 - mmengine - INFO - Iter(train) [36700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:32:51 time: 1.9484 data_time: 0.0168 memory: 6646 grad_norm: 5.4019 loss: 0.7423 decode.loss_ce: 0.4901 decode.acc_seg: 82.2532 aux.loss_ce: 0.2522 aux.acc_seg: 80.5736 2024/10/25 17:21:51 - mmengine - INFO - Iter(train) [36750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:31:13 time: 1.9468 data_time: 0.0175 memory: 6646 grad_norm: 4.8115 loss: 0.6876 decode.loss_ce: 0.4621 decode.acc_seg: 87.8419 aux.loss_ce: 0.2256 aux.acc_seg: 82.8004 2024/10/25 17:23:28 - mmengine - INFO - Iter(train) [36800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:29:34 time: 1.9449 data_time: 0.0173 memory: 6646 grad_norm: 5.3789 loss: 0.6371 decode.loss_ce: 0.4228 decode.acc_seg: 82.7447 aux.loss_ce: 0.2143 aux.acc_seg: 79.9019 2024/10/25 17:25:06 - mmengine - INFO - Iter(train) [36850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:27:56 time: 1.9486 data_time: 0.0170 memory: 6645 grad_norm: 7.6796 loss: 0.6646 decode.loss_ce: 0.4327 decode.acc_seg: 82.5159 aux.loss_ce: 0.2319 aux.acc_seg: 82.1595 2024/10/25 17:26:44 - mmengine - INFO - Iter(train) [36900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:26:18 time: 1.9497 data_time: 0.0171 memory: 6646 grad_norm: 7.1958 loss: 0.6326 decode.loss_ce: 0.4362 decode.acc_seg: 85.9773 aux.loss_ce: 0.1964 aux.acc_seg: 81.4706 2024/10/25 17:28:21 - mmengine - INFO - Iter(train) [36950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:24:40 time: 1.9537 data_time: 0.0171 memory: 6647 grad_norm: 4.9130 loss: 0.6842 decode.loss_ce: 0.4540 decode.acc_seg: 81.9368 aux.loss_ce: 0.2303 aux.acc_seg: 71.8786 2024/10/25 17:29:59 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 17:29:59 - mmengine - INFO - Iter(train) [37000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:23:01 time: 1.9531 data_time: 0.0173 memory: 6645 grad_norm: 
5.9284 loss: 0.6551 decode.loss_ce: 0.4390 decode.acc_seg: 81.4522 aux.loss_ce: 0.2161 aux.acc_seg: 78.8246 2024/10/25 17:31:36 - mmengine - INFO - Iter(train) [37050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:21:23 time: 1.9441 data_time: 0.0165 memory: 6646 grad_norm: 7.6089 loss: 0.6661 decode.loss_ce: 0.4489 decode.acc_seg: 90.0229 aux.loss_ce: 0.2172 aux.acc_seg: 89.5318 2024/10/25 17:33:14 - mmengine - INFO - Iter(train) [37100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:19:45 time: 1.9428 data_time: 0.0170 memory: 6645 grad_norm: 4.8977 loss: 0.6735 decode.loss_ce: 0.4536 decode.acc_seg: 79.4106 aux.loss_ce: 0.2199 aux.acc_seg: 77.0627 2024/10/25 17:34:51 - mmengine - INFO - Iter(train) [37150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:18:06 time: 1.9443 data_time: 0.0169 memory: 6646 grad_norm: 4.9599 loss: 0.6394 decode.loss_ce: 0.4273 decode.acc_seg: 85.5582 aux.loss_ce: 0.2121 aux.acc_seg: 87.1761 2024/10/25 17:36:29 - mmengine - INFO - Iter(train) [37200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:16:28 time: 1.9406 data_time: 0.0170 memory: 6645 grad_norm: 4.3857 loss: 0.7076 decode.loss_ce: 0.4820 decode.acc_seg: 82.3929 aux.loss_ce: 0.2256 aux.acc_seg: 87.9231 2024/10/25 17:38:06 - mmengine - INFO - Iter(train) [37250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:14:49 time: 1.9541 data_time: 0.0167 memory: 6645 grad_norm: 3.6902 loss: 0.5719 decode.loss_ce: 0.3795 decode.acc_seg: 79.7097 aux.loss_ce: 0.1925 aux.acc_seg: 72.9908 2024/10/25 17:39:44 - mmengine - INFO - Iter(train) [37300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:13:11 time: 1.9603 data_time: 0.0154 memory: 6645 grad_norm: 6.7990 loss: 0.6606 decode.loss_ce: 0.4475 decode.acc_seg: 85.6097 aux.loss_ce: 0.2131 aux.acc_seg: 86.3929 2024/10/25 17:41:21 - mmengine - INFO - Iter(train) [37350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:11:33 time: 1.9415 data_time: 0.0172 memory: 6646 grad_norm: 5.0948 loss: 0.6009 decode.loss_ce: 0.4047 
decode.acc_seg: 84.6336 aux.loss_ce: 0.1962 aux.acc_seg: 76.5297 2024/10/25 17:42:59 - mmengine - INFO - Iter(train) [37400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:09:54 time: 1.9569 data_time: 0.0180 memory: 6645 grad_norm: 5.6737 loss: 0.7737 decode.loss_ce: 0.5079 decode.acc_seg: 75.0789 aux.loss_ce: 0.2658 aux.acc_seg: 68.1406 2024/10/25 17:44:36 - mmengine - INFO - Iter(train) [37450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:08:16 time: 1.9494 data_time: 0.0152 memory: 6645 grad_norm: 4.9034 loss: 0.6964 decode.loss_ce: 0.4623 decode.acc_seg: 90.1737 aux.loss_ce: 0.2341 aux.acc_seg: 86.7675 2024/10/25 17:46:14 - mmengine - INFO - Iter(train) [37500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:06:38 time: 1.9490 data_time: 0.0169 memory: 6647 grad_norm: 4.6199 loss: 0.6992 decode.loss_ce: 0.4658 decode.acc_seg: 80.1196 aux.loss_ce: 0.2334 aux.acc_seg: 75.8502 2024/10/25 17:47:52 - mmengine - INFO - Iter(train) [37550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:05:00 time: 1.9454 data_time: 0.0169 memory: 6646 grad_norm: 4.5040 loss: 0.6536 decode.loss_ce: 0.4456 decode.acc_seg: 88.8248 aux.loss_ce: 0.2081 aux.acc_seg: 88.3127 2024/10/25 17:49:29 - mmengine - INFO - Iter(train) [37600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:03:22 time: 1.9498 data_time: 0.0161 memory: 6646 grad_norm: 4.6477 loss: 0.6922 decode.loss_ce: 0.4518 decode.acc_seg: 82.5794 aux.loss_ce: 0.2404 aux.acc_seg: 80.4460 2024/10/25 17:51:07 - mmengine - INFO - Iter(train) [37650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:01:43 time: 1.9500 data_time: 0.0168 memory: 6645 grad_norm: 5.0172 loss: 0.6738 decode.loss_ce: 0.4600 decode.acc_seg: 70.1898 aux.loss_ce: 0.2138 aux.acc_seg: 64.7645 2024/10/25 17:52:45 - mmengine - INFO - Iter(train) [37700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 23:00:05 time: 1.9507 data_time: 0.0169 memory: 6645 grad_norm: 6.5794 loss: 0.6170 decode.loss_ce: 0.4193 decode.acc_seg: 84.2782 aux.loss_ce: 0.1977 
aux.acc_seg: 82.6265 2024/10/25 17:54:22 - mmengine - INFO - Iter(train) [37750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:58:27 time: 1.9486 data_time: 0.0171 memory: 6646 grad_norm: 5.4191 loss: 0.6260 decode.loss_ce: 0.4220 decode.acc_seg: 83.4102 aux.loss_ce: 0.2040 aux.acc_seg: 74.9779 2024/10/25 17:56:00 - mmengine - INFO - Iter(train) [37800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:56:49 time: 1.9567 data_time: 0.0169 memory: 6645 grad_norm: 5.8820 loss: 0.6831 decode.loss_ce: 0.4540 decode.acc_seg: 84.1824 aux.loss_ce: 0.2291 aux.acc_seg: 69.0951 2024/10/25 17:57:37 - mmengine - INFO - Iter(train) [37850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:55:10 time: 1.9432 data_time: 0.0170 memory: 6645 grad_norm: 6.1153 loss: 0.7058 decode.loss_ce: 0.4805 decode.acc_seg: 76.0445 aux.loss_ce: 0.2254 aux.acc_seg: 73.4250 2024/10/25 17:59:15 - mmengine - INFO - Iter(train) [37900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:53:32 time: 1.9482 data_time: 0.0170 memory: 6645 grad_norm: 5.3853 loss: 0.6511 decode.loss_ce: 0.4299 decode.acc_seg: 82.5879 aux.loss_ce: 0.2212 aux.acc_seg: 79.9047 2024/10/25 18:00:52 - mmengine - INFO - Iter(train) [37950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:51:54 time: 1.9424 data_time: 0.0167 memory: 6645 grad_norm: 5.4363 loss: 0.6847 decode.loss_ce: 0.4672 decode.acc_seg: 81.6767 aux.loss_ce: 0.2175 aux.acc_seg: 79.9153 2024/10/25 18:02:30 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 18:02:30 - mmengine - INFO - Iter(train) [38000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:50:15 time: 1.9542 data_time: 0.0163 memory: 6645 grad_norm: 6.4871 loss: 0.6862 decode.loss_ce: 0.4521 decode.acc_seg: 82.7414 aux.loss_ce: 0.2342 aux.acc_seg: 77.4513 2024/10/25 18:04:07 - mmengine - INFO - Iter(train) [38050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:48:37 time: 1.9460 data_time: 0.0155 memory: 6645 grad_norm: 5.1688 loss: 0.6422 
decode.loss_ce: 0.4318 decode.acc_seg: 79.8968 aux.loss_ce: 0.2105 aux.acc_seg: 73.9051 2024/10/25 18:05:44 - mmengine - INFO - Iter(train) [38100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:46:59 time: 1.9453 data_time: 0.0176 memory: 6647 grad_norm: 5.2517 loss: 0.6208 decode.loss_ce: 0.4105 decode.acc_seg: 77.8792 aux.loss_ce: 0.2103 aux.acc_seg: 74.1997 2024/10/25 18:07:22 - mmengine - INFO - Iter(train) [38150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:45:21 time: 1.9479 data_time: 0.0170 memory: 6646 grad_norm: 6.3782 loss: 0.8013 decode.loss_ce: 0.5319 decode.acc_seg: 77.6224 aux.loss_ce: 0.2694 aux.acc_seg: 67.7019 2024/10/25 18:09:00 - mmengine - INFO - Iter(train) [38200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:43:43 time: 1.9529 data_time: 0.0181 memory: 6645 grad_norm: 5.0185 loss: 0.6682 decode.loss_ce: 0.4395 decode.acc_seg: 79.5893 aux.loss_ce: 0.2287 aux.acc_seg: 75.1395 2024/10/25 18:10:37 - mmengine - INFO - Iter(train) [38250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:42:04 time: 1.9567 data_time: 0.0162 memory: 6645 grad_norm: 6.9841 loss: 0.6788 decode.loss_ce: 0.4556 decode.acc_seg: 79.1718 aux.loss_ce: 0.2232 aux.acc_seg: 75.2351 2024/10/25 18:12:15 - mmengine - INFO - Iter(train) [38300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:40:26 time: 1.9509 data_time: 0.0163 memory: 6646 grad_norm: 4.8931 loss: 0.5546 decode.loss_ce: 0.3624 decode.acc_seg: 85.1242 aux.loss_ce: 0.1922 aux.acc_seg: 83.0070 2024/10/25 18:13:53 - mmengine - INFO - Iter(train) [38350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:38:48 time: 1.9460 data_time: 0.0177 memory: 6645 grad_norm: 5.6817 loss: 0.6069 decode.loss_ce: 0.4105 decode.acc_seg: 80.7499 aux.loss_ce: 0.1965 aux.acc_seg: 82.5702 2024/10/25 18:15:30 - mmengine - INFO - Iter(train) [38400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:37:10 time: 1.9496 data_time: 0.0166 memory: 6646 grad_norm: 4.7961 loss: 0.6015 decode.loss_ce: 0.4099 decode.acc_seg: 83.5410 
aux.loss_ce: 0.1916 aux.acc_seg: 83.7821 2024/10/25 18:17:08 - mmengine - INFO - Iter(train) [38450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:35:32 time: 1.9675 data_time: 0.0168 memory: 6644 grad_norm: 6.7640 loss: 0.6160 decode.loss_ce: 0.4023 decode.acc_seg: 86.8734 aux.loss_ce: 0.2136 aux.acc_seg: 80.7456 2024/10/25 18:18:45 - mmengine - INFO - Iter(train) [38500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:33:53 time: 1.9489 data_time: 0.0173 memory: 6645 grad_norm: 5.4988 loss: 0.7162 decode.loss_ce: 0.4669 decode.acc_seg: 73.7481 aux.loss_ce: 0.2493 aux.acc_seg: 74.7919 2024/10/25 18:20:24 - mmengine - INFO - Iter(train) [38550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:32:16 time: 1.9559 data_time: 0.0161 memory: 6646 grad_norm: 8.8690 loss: 0.6017 decode.loss_ce: 0.4104 decode.acc_seg: 80.7538 aux.loss_ce: 0.1913 aux.acc_seg: 81.9364 2024/10/25 18:22:01 - mmengine - INFO - Iter(train) [38600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:30:38 time: 1.9535 data_time: 0.0173 memory: 6645 grad_norm: 5.6725 loss: 0.7763 decode.loss_ce: 0.5182 decode.acc_seg: 88.6433 aux.loss_ce: 0.2580 aux.acc_seg: 82.4869 2024/10/25 18:23:39 - mmengine - INFO - Iter(train) [38650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:29:00 time: 1.9434 data_time: 0.0168 memory: 6646 grad_norm: 5.1545 loss: 0.6281 decode.loss_ce: 0.4142 decode.acc_seg: 85.7172 aux.loss_ce: 0.2139 aux.acc_seg: 87.8826 2024/10/25 18:25:17 - mmengine - INFO - Iter(train) [38700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:27:22 time: 1.9529 data_time: 0.0174 memory: 6646 grad_norm: 9.0579 loss: 0.6286 decode.loss_ce: 0.4281 decode.acc_seg: 84.4619 aux.loss_ce: 0.2006 aux.acc_seg: 77.4622 2024/10/25 18:26:54 - mmengine - INFO - Iter(train) [38750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:25:43 time: 1.9385 data_time: 0.0170 memory: 6645 grad_norm: 5.0082 loss: 0.6058 decode.loss_ce: 0.3991 decode.acc_seg: 80.0893 aux.loss_ce: 0.2067 aux.acc_seg: 74.5919 2024/10/25 
18:28:31 - mmengine - INFO - Iter(train) [38800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:24:05 time: 1.9460 data_time: 0.0163 memory: 6647 grad_norm: 5.2046 loss: 0.7039 decode.loss_ce: 0.4652 decode.acc_seg: 82.6927 aux.loss_ce: 0.2387 aux.acc_seg: 80.2895 2024/10/25 18:30:09 - mmengine - INFO - Iter(train) [38850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:22:26 time: 1.9452 data_time: 0.0164 memory: 6645 grad_norm: 5.2942 loss: 0.6671 decode.loss_ce: 0.4498 decode.acc_seg: 87.3280 aux.loss_ce: 0.2173 aux.acc_seg: 82.2834 2024/10/25 18:31:46 - mmengine - INFO - Iter(train) [38900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:20:48 time: 1.9419 data_time: 0.0175 memory: 6647 grad_norm: 5.9658 loss: 0.6327 decode.loss_ce: 0.4156 decode.acc_seg: 84.3743 aux.loss_ce: 0.2171 aux.acc_seg: 81.7377 2024/10/25 18:33:24 - mmengine - INFO - Iter(train) [38950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:19:10 time: 1.9459 data_time: 0.0162 memory: 6646 grad_norm: 5.0823 loss: 0.6528 decode.loss_ce: 0.4398 decode.acc_seg: 76.2528 aux.loss_ce: 0.2130 aux.acc_seg: 71.7790 2024/10/25 18:35:01 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 18:35:01 - mmengine - INFO - Iter(train) [39000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:17:32 time: 1.9489 data_time: 0.0166 memory: 6646 grad_norm: 5.4698 loss: 0.6149 decode.loss_ce: 0.4090 decode.acc_seg: 80.8033 aux.loss_ce: 0.2059 aux.acc_seg: 82.1770 2024/10/25 18:36:43 - mmengine - INFO - Iter(train) [39050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:15:58 time: 1.9649 data_time: 0.0171 memory: 6645 grad_norm: 6.0833 loss: 0.6473 decode.loss_ce: 0.4318 decode.acc_seg: 87.9064 aux.loss_ce: 0.2155 aux.acc_seg: 88.4866 2024/10/25 18:38:21 - mmengine - INFO - Iter(train) [39100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:14:20 time: 1.9487 data_time: 0.0181 memory: 6646 grad_norm: 3.4789 loss: 0.5533 decode.loss_ce: 0.3702 decode.acc_seg: 
86.8621 aux.loss_ce: 0.1830 aux.acc_seg: 82.6835 2024/10/25 18:39:58 - mmengine - INFO - Iter(train) [39150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:12:41 time: 1.9424 data_time: 0.0174 memory: 6645 grad_norm: 7.1552 loss: 0.7108 decode.loss_ce: 0.4840 decode.acc_seg: 86.8482 aux.loss_ce: 0.2269 aux.acc_seg: 84.3799 2024/10/25 18:41:35 - mmengine - INFO - Iter(train) [39200/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:11:03 time: 1.9515 data_time: 0.0171 memory: 6646 grad_norm: 6.3306 loss: 0.6401 decode.loss_ce: 0.4257 decode.acc_seg: 86.2019 aux.loss_ce: 0.2144 aux.acc_seg: 82.6715 2024/10/25 18:43:13 - mmengine - INFO - Iter(train) [39250/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:09:24 time: 1.9428 data_time: 0.0167 memory: 6646 grad_norm: 5.6945 loss: 0.7439 decode.loss_ce: 0.4899 decode.acc_seg: 88.7022 aux.loss_ce: 0.2541 aux.acc_seg: 87.0409 2024/10/25 18:44:50 - mmengine - INFO - Iter(train) [39300/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:07:46 time: 1.9518 data_time: 0.0156 memory: 6645 grad_norm: 5.7225 loss: 0.6217 decode.loss_ce: 0.4196 decode.acc_seg: 84.5286 aux.loss_ce: 0.2021 aux.acc_seg: 79.4476 2024/10/25 18:46:28 - mmengine - INFO - Iter(train) [39350/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:06:08 time: 1.9520 data_time: 0.0161 memory: 6646 grad_norm: 5.9692 loss: 0.7155 decode.loss_ce: 0.4692 decode.acc_seg: 87.0771 aux.loss_ce: 0.2463 aux.acc_seg: 76.5313 2024/10/25 18:48:05 - mmengine - INFO - Iter(train) [39400/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:04:29 time: 1.9428 data_time: 0.0163 memory: 6645 grad_norm: 5.8523 loss: 0.6222 decode.loss_ce: 0.3893 decode.acc_seg: 82.6818 aux.loss_ce: 0.2329 aux.acc_seg: 82.7270 2024/10/25 18:49:44 - mmengine - INFO - Iter(train) [39450/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:02:52 time: 1.9369 data_time: 0.0172 memory: 6645 grad_norm: 5.4067 loss: 0.7348 decode.loss_ce: 0.4960 decode.acc_seg: 88.9084 aux.loss_ce: 0.2388 aux.acc_seg: 82.2549 
2024/10/25 18:51:21 - mmengine - INFO - Iter(train) [39500/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 22:01:14 time: 1.9519 data_time: 0.0157 memory: 6645 grad_norm: 7.6142 loss: 0.6244 decode.loss_ce: 0.4190 decode.acc_seg: 81.4812 aux.loss_ce: 0.2054 aux.acc_seg: 76.5561 2024/10/25 18:52:59 - mmengine - INFO - Iter(train) [39550/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:59:36 time: 1.9554 data_time: 0.0159 memory: 6645 grad_norm: 5.5835 loss: 0.5494 decode.loss_ce: 0.3713 decode.acc_seg: 79.9137 aux.loss_ce: 0.1780 aux.acc_seg: 79.3275 2024/10/25 18:54:37 - mmengine - INFO - Iter(train) [39600/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:57:58 time: 1.9444 data_time: 0.0169 memory: 6645 grad_norm: 5.5405 loss: 0.6211 decode.loss_ce: 0.4053 decode.acc_seg: 82.5551 aux.loss_ce: 0.2158 aux.acc_seg: 77.6118 2024/10/25 18:56:14 - mmengine - INFO - Iter(train) [39650/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:56:19 time: 1.9467 data_time: 0.0156 memory: 6647 grad_norm: 7.1432 loss: 0.6309 decode.loss_ce: 0.4227 decode.acc_seg: 82.3765 aux.loss_ce: 0.2082 aux.acc_seg: 80.9050 2024/10/25 18:57:52 - mmengine - INFO - Iter(train) [39700/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:54:41 time: 1.9685 data_time: 0.0163 memory: 6647 grad_norm: 4.1776 loss: 0.6370 decode.loss_ce: 0.4244 decode.acc_seg: 76.3859 aux.loss_ce: 0.2126 aux.acc_seg: 77.9814 2024/10/25 18:59:29 - mmengine - INFO - Iter(train) [39750/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:53:03 time: 1.9464 data_time: 0.0169 memory: 6645 grad_norm: 6.2459 loss: 0.6258 decode.loss_ce: 0.4168 decode.acc_seg: 86.2748 aux.loss_ce: 0.2090 aux.acc_seg: 84.6653 2024/10/25 19:01:07 - mmengine - INFO - Iter(train) [39800/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:51:25 time: 1.9493 data_time: 0.0166 memory: 6645 grad_norm: 5.7821 loss: 0.6766 decode.loss_ce: 0.4623 decode.acc_seg: 85.2273 aux.loss_ce: 0.2142 aux.acc_seg: 87.3956 2024/10/25 19:02:44 - mmengine - INFO - Iter(train) 
[39850/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:49:47 time: 1.9446 data_time: 0.0162 memory: 6645 grad_norm: 4.3635 loss: 0.6552 decode.loss_ce: 0.4408 decode.acc_seg: 83.2341 aux.loss_ce: 0.2144 aux.acc_seg: 80.9827 2024/10/25 19:04:22 - mmengine - INFO - Iter(train) [39900/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:48:09 time: 1.9637 data_time: 0.0153 memory: 6644 grad_norm: 6.9966 loss: 0.6102 decode.loss_ce: 0.4106 decode.acc_seg: 72.5435 aux.loss_ce: 0.1996 aux.acc_seg: 73.2739 2024/10/25 19:06:00 - mmengine - INFO - Iter(train) [39950/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:46:31 time: 1.9444 data_time: 0.0170 memory: 6646 grad_norm: 4.9197 loss: 0.6762 decode.loss_ce: 0.4387 decode.acc_seg: 86.1186 aux.loss_ce: 0.2375 aux.acc_seg: 83.7125 2024/10/25 19:07:37 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 19:07:37 - mmengine - INFO - Iter(train) [40000/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:44:52 time: 1.9612 data_time: 0.0163 memory: 6646 grad_norm: 6.6648 loss: 0.7545 decode.loss_ce: 0.4978 decode.acc_seg: 82.6166 aux.loss_ce: 0.2567 aux.acc_seg: 78.3329 2024/10/25 19:07:37 - mmengine - INFO - Saving checkpoint at 40000 iterations 2024/10/25 19:07:42 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0333 data_time: 0.0018 memory: 1049 2024/10/25 19:07:43 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0327 data_time: 0.0017 memory: 1117 2024/10/25 19:07:45 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0334 data_time: 0.0017 memory: 833 2024/10/25 19:07:47 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0331 data_time: 0.0016 memory: 866 2024/10/25 19:07:48 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0333 data_time: 0.0018 memory: 906 2024/10/25 19:07:50 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0331 data_time: 0.0020 memory: 2028 2024/10/25 19:07:52 - mmengine - INFO - Iter(val) 
[350/500] eta: 0:00:05 time: 0.0324 data_time: 0.0017 memory: 832
2024/10/25 19:07:53 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0335 data_time: 0.0017 memory: 904
2024/10/25 19:07:55 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0320 data_time: 0.0015 memory: 839
2024/10/25 19:07:56 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0313 data_time: 0.0014 memory: 889
2024/10/25 19:07:58 - mmengine - INFO - per class results:
2024/10/25 19:07:58 - mmengine - INFO - +---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 66.09 | 84.08 |
| building | 75.57 | 87.94 |
| sky | 88.82 | 94.42 |
| floor | 69.56 | 84.83 |
| tree | 66.88 | 82.85 |
| ceiling | 75.87 | 86.01 |
| road | 76.38 | 85.23 |
| bed | 78.07 | 88.83 |
| windowpane | 50.71 | 67.11 |
| grass | 55.2 | 72.08 |
| cabinet | 49.44 | 62.65 |
| sidewalk | 55.36 | 71.68 |
| person | 58.61 | 76.71 |
| earth | 25.84 | 34.7 |
| door | 33.5 | 48.88 |
| table | 41.67 | 61.3 |
| mountain | 54.31 | 71.83 |
| plant | 40.15 | 48.89 |
| curtain | 54.42 | 68.63 |
| chair | 37.58 | 51.99 |
| car | 68.32 | 84.0 |
| water | 45.79 | 57.34 |
| painting | 51.32 | 62.41 |
| sofa | 53.89 | 68.49 |
| shelf | 27.26 | 37.35 |
| house | 37.18 | 51.9 |
| sea | 37.65 | 57.46 |
| mirror | 47.33 | 53.63 |
| rug | 46.98 | 61.45 |
| field | 22.19 | 47.59 |
| armchair | 31.6 | 45.59 |
| seat | 50.48 | 66.8 |
| fence | 36.05 | 58.67 |
| desk | 37.58 | 51.81 |
| rock | 29.07 | 60.11 |
| wardrobe | 38.88 | 62.81 |
| lamp | 29.04 | 36.1 |
| bathtub | 67.96 | 80.59 |
| railing | 26.42 | 37.86 |
| cushion | 31.24 | 41.54 |
| base | 15.6 | 25.7 |
| box | 11.8 | 16.7 |
| column | 25.5 | 31.77 |
| signboard | 17.91 | 22.17 |
| chest of drawers | 34.18 | 58.47 |
| counter | 27.44 | 33.42 |
| sand | 26.23 | 47.99 |
| sink | 48.69 | 62.5 |
| skyscraper | 54.9 | 69.4 |
| fireplace | 63.16 | 75.37 |
| refrigerator | 63.25 | 76.36 |
| grandstand | 36.47 | 60.63 |
| path | 15.17 | 26.83 |
| stairs | 28.48 | 36.95 |
| runway | 66.89 | 94.39 |
| case | 47.36 | 61.21 |
| pool table | 75.71 | 81.55 |
| pillow | 33.88 | 40.29 |
| screen door | 54.6 | 79.1 |
| stairway | 22.88 | 27.9 |
| river | 12.69 | 23.36 |
| bridge | 51.91 | 62.04 |
| bookcase | 25.3 | 36.65 |
| blind | 32.96 | 40.95 |
| coffee table | 45.84 | 61.26 |
| toilet | 60.93 | 77.21 |
| flower | 21.24 | 38.31 |
| book | 31.61 | 45.57 |
| hill | 4.73 | 6.88 |
| bench | 33.01 | 42.41 |
| countertop | 40.94 | 54.7 |
| stove | 54.25 | 64.85 |
| palm | 33.65 | 58.71 |
| kitchen island | 13.8 | 19.73 |
| computer | 40.46 | 49.11 |
| swivel chair | 34.95 | 52.59 |
| boat | 51.2 | 77.38 |
| bar | 24.61 | 31.98 |
| arcade machine | 19.55 | 23.47 |
| hovel | 29.47 | 35.89 |
| bus | 75.07 | 80.2 |
| towel | 30.72 | 54.18 |
| light | 11.69 | 13.28 |
| truck | 19.76 | 28.24 |
| tower | 35.56 | 68.49 |
| chandelier | 45.26 | 58.76 |
| awning | 16.87 | 20.99 |
| streetlight | 2.97 | 3.48 |
| booth | 51.24 | 53.77 |
| television receiver | 42.01 | 46.26 |
| airplane | 30.29 | 44.79 |
| dirt track | 2.65 | 19.69 |
| apparel | 26.65 | 40.81 |
| pole | 5.87 | 7.32 |
| land | 0.0 | 0.0 |
| bannister | 0.37 | 0.42 |
| escalator | 40.0 | 62.79 |
| ottoman | 25.0 | 44.59 |
| bottle | 18.16 | 24.67 |
| buffet | 36.4 | 37.82 |
| poster | 27.67 | 35.25 |
| stage | 12.15 | 23.52 |
| van | 31.84 | 37.87 |
| ship | 52.12 | 66.88 |
| fountain | 17.25 | 17.66 |
| conveyer belt | 45.22 | 85.38 |
| canopy | 14.19 | 18.51 |
| washer | 56.9 | 67.01 |
| plaything | 8.5 | 10.86 |
| swimming pool | 42.73 | 51.82 |
| stool | 20.65 | 27.86 |
| barrel | 39.41 | 59.35 |
| basket | 10.71 | 19.67 |
| waterfall | 31.56 | 38.9 |
| tent | 83.16 | 94.4 |
| bag | 5.39 | 7.32 |
| minibike | 44.44 | 72.08 |
| cradle | 36.09 | 71.3 |
| oven | 40.43 | 61.45 |
| ball | 26.07 | 30.63 |
| food | 42.34 | 48.19 |
| step | 6.79 | 7.62 |
| tank | 47.52 | 51.39 |
| trade name | 11.57 | 12.61 |
| microwave | 53.81 | 59.8 |
| pot | 19.96 | 23.66 |
| animal | 39.44 | 49.79 |
| bicycle | 31.69 | 49.37 |
| lake | 16.91 | 30.72 |
| dishwasher | 41.24 | 55.45 |
| screen | 58.17 | 78.69 |
| blanket | 3.72 | 4.17 |
| sculpture | 31.26 | 39.02 |
| hood | 37.51 | 42.59 |
| sconce | 15.87 | 19.32 |
| vase | 13.72 | 19.25 |
| traffic light | 7.9 | 11.6 |
| tray | 1.5 | 2.31 |
| ashcan | 17.13 | 32.63 |
| fan | 26.5 | 35.17 |
| pier | 47.62 | 83.84 |
| crt screen | 0.0 | 0.0 |
| plate | 28.75 | 37.69 |
| monitor | 5.35 | 5.54 |
| bulletin board | 28.85 | 32.12 |
| shower | 0.0 | 0.0 |
| radiator | 35.33 | 39.92 |
| glass | 2.21 | 2.39 |
| clock | 0.73 | 0.73 |
| flag | 25.92 | 28.28 |
+---------------------+-------+-------+
2024/10/25 19:07:58 - mmengine - INFO - Iter(val) [500/500] aAcc: 75.1000 mIoU: 35.1100 mAcc: 46.6500 data_time: 0.0018 time: 0.0332
2024/10/25 19:09:35 - mmengine - INFO - Iter(train) [40050/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:43:15 time: 1.9464 data_time: 0.0193 memory: 6646 grad_norm: 6.5987 loss: 0.6709 decode.loss_ce: 0.4419 decode.acc_seg: 86.2128 aux.loss_ce: 0.2290 aux.acc_seg: 84.3794
2024/10/25 19:11:13 - mmengine - INFO - Iter(train) [40100/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:41:37 time: 1.9522 data_time: 0.0184 memory: 6645 grad_norm: 8.2537 loss: 0.6517 decode.loss_ce: 0.4386 decode.acc_seg: 79.1205 aux.loss_ce: 0.2131 aux.acc_seg: 79.1122
2024/10/25 19:12:50 - mmengine - INFO - Iter(train) [40150/80000] base_lr: 1.2000e-04 lr: 1.2000e-04 eta: 21:39:59 time: 1.9475 data_time: 0.0168 memory: 6645 grad_norm: 5.8138 loss: 0.5392 decode.loss_ce: 0.3553 decode.acc_seg: 80.9413 aux.loss_ce: 0.1839 aux.acc_seg: 78.6354
2024/10/25 19:14:28 - mmengine - INFO - Iter(train) [40200/80000] base_lr: 1.1999e-04 lr: 1.1999e-04 eta: 21:38:21 time: 1.9560 data_time: 0.0178 memory: 6646 grad_norm: 5.2273 loss: 0.6921 decode.loss_ce: 0.4578 decode.acc_seg: 84.2184 aux.loss_ce: 0.2342 aux.acc_seg: 82.7618
2024/10/25 19:16:06 - mmengine - INFO - Iter(train)
[40250/80000] base_lr: 1.1999e-04 lr: 1.1999e-04 eta: 21:36:43 time: 1.9496 data_time: 0.0175 memory: 6646 grad_norm: 4.8783 loss: 0.6357 decode.loss_ce: 0.4229 decode.acc_seg: 84.0878 aux.loss_ce: 0.2129 aux.acc_seg: 81.4307 2024/10/25 19:17:44 - mmengine - INFO - Iter(train) [40300/80000] base_lr: 1.1998e-04 lr: 1.1998e-04 eta: 21:35:05 time: 1.9542 data_time: 0.0191 memory: 6645 grad_norm: 4.7531 loss: 0.6020 decode.loss_ce: 0.3966 decode.acc_seg: 84.8649 aux.loss_ce: 0.2054 aux.acc_seg: 77.1309 2024/10/25 19:19:22 - mmengine - INFO - Iter(train) [40350/80000] base_lr: 1.1998e-04 lr: 1.1998e-04 eta: 21:33:28 time: 1.9653 data_time: 0.0174 memory: 6645 grad_norm: 5.9417 loss: 0.6809 decode.loss_ce: 0.4655 decode.acc_seg: 78.3110 aux.loss_ce: 0.2154 aux.acc_seg: 76.4812 2024/10/25 19:21:00 - mmengine - INFO - Iter(train) [40400/80000] base_lr: 1.1997e-04 lr: 1.1997e-04 eta: 21:31:50 time: 1.9483 data_time: 0.0174 memory: 6648 grad_norm: 9.4529 loss: 0.8319 decode.loss_ce: 0.5720 decode.acc_seg: 86.1511 aux.loss_ce: 0.2599 aux.acc_seg: 86.0671 2024/10/25 19:22:37 - mmengine - INFO - Iter(train) [40450/80000] base_lr: 1.1996e-04 lr: 1.1996e-04 eta: 21:30:11 time: 1.9456 data_time: 0.0189 memory: 6646 grad_norm: 4.5762 loss: 0.5907 decode.loss_ce: 0.4054 decode.acc_seg: 80.4750 aux.loss_ce: 0.1853 aux.acc_seg: 77.9757 2024/10/25 19:24:14 - mmengine - INFO - Iter(train) [40500/80000] base_lr: 1.1995e-04 lr: 1.1995e-04 eta: 21:28:33 time: 1.9450 data_time: 0.0171 memory: 6645 grad_norm: 6.1942 loss: 0.5955 decode.loss_ce: 0.3993 decode.acc_seg: 82.5404 aux.loss_ce: 0.1962 aux.acc_seg: 83.1094 2024/10/25 19:25:52 - mmengine - INFO - Iter(train) [40550/80000] base_lr: 1.1994e-04 lr: 1.1994e-04 eta: 21:26:55 time: 1.9526 data_time: 0.0188 memory: 6645 grad_norm: 5.4809 loss: 0.5363 decode.loss_ce: 0.3629 decode.acc_seg: 84.0606 aux.loss_ce: 0.1734 aux.acc_seg: 79.9500 2024/10/25 19:27:30 - mmengine - INFO - Iter(train) [40600/80000] base_lr: 1.1993e-04 lr: 1.1993e-04 eta: 
21:25:17 time: 1.9368 data_time: 0.0178 memory: 6646 grad_norm: 4.4270 loss: 0.5806 decode.loss_ce: 0.3857 decode.acc_seg: 82.7528 aux.loss_ce: 0.1949 aux.acc_seg: 73.1262 2024/10/25 19:29:07 - mmengine - INFO - Iter(train) [40650/80000] base_lr: 1.1992e-04 lr: 1.1992e-04 eta: 21:23:38 time: 1.9465 data_time: 0.0181 memory: 6645 grad_norm: 5.3769 loss: 0.6263 decode.loss_ce: 0.4235 decode.acc_seg: 85.7910 aux.loss_ce: 0.2028 aux.acc_seg: 84.5448 2024/10/25 19:30:44 - mmengine - INFO - Iter(train) [40700/80000] base_lr: 1.1991e-04 lr: 1.1991e-04 eta: 21:22:00 time: 1.9423 data_time: 0.0186 memory: 6644 grad_norm: 7.7961 loss: 0.6591 decode.loss_ce: 0.4452 decode.acc_seg: 86.7114 aux.loss_ce: 0.2139 aux.acc_seg: 80.5114 2024/10/25 19:32:22 - mmengine - INFO - Iter(train) [40750/80000] base_lr: 1.1990e-04 lr: 1.1990e-04 eta: 21:20:22 time: 1.9493 data_time: 0.0188 memory: 6645 grad_norm: 4.7180 loss: 0.6452 decode.loss_ce: 0.4355 decode.acc_seg: 84.9875 aux.loss_ce: 0.2097 aux.acc_seg: 86.0254 2024/10/25 19:34:00 - mmengine - INFO - Iter(train) [40800/80000] base_lr: 1.1988e-04 lr: 1.1988e-04 eta: 21:18:44 time: 1.9432 data_time: 0.0184 memory: 6645 grad_norm: 6.1226 loss: 0.7780 decode.loss_ce: 0.5005 decode.acc_seg: 88.5642 aux.loss_ce: 0.2776 aux.acc_seg: 84.9203 2024/10/25 19:35:37 - mmengine - INFO - Iter(train) [40850/80000] base_lr: 1.1987e-04 lr: 1.1987e-04 eta: 21:17:06 time: 1.9520 data_time: 0.0179 memory: 6646 grad_norm: 4.5049 loss: 0.5714 decode.loss_ce: 0.3828 decode.acc_seg: 79.1806 aux.loss_ce: 0.1886 aux.acc_seg: 76.8934 2024/10/25 19:37:15 - mmengine - INFO - Iter(train) [40900/80000] base_lr: 1.1985e-04 lr: 1.1985e-04 eta: 21:15:28 time: 1.9469 data_time: 0.0180 memory: 6645 grad_norm: 8.9447 loss: 0.5793 decode.loss_ce: 0.4053 decode.acc_seg: 85.6816 aux.loss_ce: 0.1740 aux.acc_seg: 86.5209 2024/10/25 19:38:53 - mmengine - INFO - Iter(train) [40950/80000] base_lr: 1.1983e-04 lr: 1.1983e-04 eta: 21:13:50 time: 1.9523 data_time: 0.0180 memory: 6646 
grad_norm: 6.4874 loss: 0.6076 decode.loss_ce: 0.4003 decode.acc_seg: 89.4144 aux.loss_ce: 0.2074 aux.acc_seg: 88.3736 2024/10/25 19:40:31 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 19:40:31 - mmengine - INFO - Iter(train) [41000/80000] base_lr: 1.1982e-04 lr: 1.1982e-04 eta: 21:12:12 time: 1.9532 data_time: 0.0185 memory: 6645 grad_norm: 4.8927 loss: 0.6662 decode.loss_ce: 0.4328 decode.acc_seg: 87.6969 aux.loss_ce: 0.2334 aux.acc_seg: 74.5583 2024/10/25 19:42:08 - mmengine - INFO - Iter(train) [41050/80000] base_lr: 1.1980e-04 lr: 1.1980e-04 eta: 21:10:34 time: 1.9624 data_time: 0.0167 memory: 6645 grad_norm: 5.0083 loss: 0.6200 decode.loss_ce: 0.4084 decode.acc_seg: 85.6966 aux.loss_ce: 0.2116 aux.acc_seg: 82.2208 2024/10/25 19:43:46 - mmengine - INFO - Iter(train) [41100/80000] base_lr: 1.1978e-04 lr: 1.1978e-04 eta: 21:08:56 time: 1.9618 data_time: 0.0167 memory: 6646 grad_norm: 6.0724 loss: 0.5851 decode.loss_ce: 0.3887 decode.acc_seg: 86.7716 aux.loss_ce: 0.1965 aux.acc_seg: 88.6419 2024/10/25 19:45:23 - mmengine - INFO - Iter(train) [41150/80000] base_lr: 1.1976e-04 lr: 1.1976e-04 eta: 21:07:18 time: 1.9491 data_time: 0.0180 memory: 6646 grad_norm: 6.9035 loss: 0.6508 decode.loss_ce: 0.4446 decode.acc_seg: 87.7757 aux.loss_ce: 0.2062 aux.acc_seg: 85.3558 2024/10/25 19:47:01 - mmengine - INFO - Iter(train) [41200/80000] base_lr: 1.1973e-04 lr: 1.1973e-04 eta: 21:05:40 time: 1.9513 data_time: 0.0181 memory: 6647 grad_norm: 5.1024 loss: 0.6900 decode.loss_ce: 0.4714 decode.acc_seg: 85.5257 aux.loss_ce: 0.2185 aux.acc_seg: 84.7097 2024/10/25 19:48:39 - mmengine - INFO - Iter(train) [41250/80000] base_lr: 1.1971e-04 lr: 1.1971e-04 eta: 21:04:02 time: 1.9493 data_time: 0.0166 memory: 6646 grad_norm: 4.2369 loss: 0.6068 decode.loss_ce: 0.4108 decode.acc_seg: 78.1674 aux.loss_ce: 0.1961 aux.acc_seg: 74.3685 2024/10/25 19:50:16 - mmengine - INFO - Iter(train) [41300/80000] base_lr: 1.1969e-04 lr: 1.1969e-04 
eta: 21:02:23 time: 1.9478 data_time: 0.0182 memory: 6646 grad_norm: 4.8627 loss: 0.6683 decode.loss_ce: 0.4526 decode.acc_seg: 68.3726 aux.loss_ce: 0.2157 aux.acc_seg: 70.3428 2024/10/25 19:51:54 - mmengine - INFO - Iter(train) [41350/80000] base_lr: 1.1966e-04 lr: 1.1966e-04 eta: 21:00:45 time: 1.9543 data_time: 0.0183 memory: 6647 grad_norm: 5.7285 loss: 0.6231 decode.loss_ce: 0.4134 decode.acc_seg: 90.7137 aux.loss_ce: 0.2098 aux.acc_seg: 84.3646 2024/10/25 19:53:31 - mmengine - INFO - Iter(train) [41400/80000] base_lr: 1.1964e-04 lr: 1.1964e-04 eta: 20:59:07 time: 1.9460 data_time: 0.0190 memory: 6644 grad_norm: 5.3469 loss: 0.6933 decode.loss_ce: 0.4517 decode.acc_seg: 75.8842 aux.loss_ce: 0.2416 aux.acc_seg: 71.6852 2024/10/25 19:55:09 - mmengine - INFO - Iter(train) [41450/80000] base_lr: 1.1961e-04 lr: 1.1961e-04 eta: 20:57:29 time: 1.9498 data_time: 0.0192 memory: 6645 grad_norm: 4.9509 loss: 0.6073 decode.loss_ce: 0.4041 decode.acc_seg: 86.4280 aux.loss_ce: 0.2032 aux.acc_seg: 84.1465 2024/10/25 19:56:46 - mmengine - INFO - Iter(train) [41500/80000] base_lr: 1.1958e-04 lr: 1.1958e-04 eta: 20:55:50 time: 1.9452 data_time: 0.0193 memory: 6647 grad_norm: 6.8495 loss: 0.5375 decode.loss_ce: 0.3445 decode.acc_seg: 89.7045 aux.loss_ce: 0.1930 aux.acc_seg: 84.9592 2024/10/25 19:58:24 - mmengine - INFO - Iter(train) [41550/80000] base_lr: 1.1956e-04 lr: 1.1956e-04 eta: 20:54:13 time: 1.9575 data_time: 0.0180 memory: 6645 grad_norm: 5.7086 loss: 0.6456 decode.loss_ce: 0.4311 decode.acc_seg: 80.5201 aux.loss_ce: 0.2145 aux.acc_seg: 72.4347 2024/10/25 20:00:02 - mmengine - INFO - Iter(train) [41600/80000] base_lr: 1.1953e-04 lr: 1.1953e-04 eta: 20:52:35 time: 1.9522 data_time: 0.0177 memory: 6645 grad_norm: 5.7788 loss: 0.8026 decode.loss_ce: 0.5178 decode.acc_seg: 77.8574 aux.loss_ce: 0.2848 aux.acc_seg: 71.1153 2024/10/25 20:01:43 - mmengine - INFO - Iter(train) [41650/80000] base_lr: 1.1950e-04 lr: 1.1950e-04 eta: 20:51:00 time: 1.9547 data_time: 0.0168 memory: 
6646 grad_norm: 5.5988 loss: 0.6367 decode.loss_ce: 0.4111 decode.acc_seg: 76.8696 aux.loss_ce: 0.2256 aux.acc_seg: 70.9024 2024/10/25 20:03:21 - mmengine - INFO - Iter(train) [41700/80000] base_lr: 1.1947e-04 lr: 1.1947e-04 eta: 20:49:22 time: 1.9431 data_time: 0.0192 memory: 6645 grad_norm: 4.2647 loss: 0.5858 decode.loss_ce: 0.3894 decode.acc_seg: 88.5905 aux.loss_ce: 0.1964 aux.acc_seg: 87.5346 2024/10/25 20:04:59 - mmengine - INFO - Iter(train) [41750/80000] base_lr: 1.1943e-04 lr: 1.1943e-04 eta: 20:47:44 time: 1.9490 data_time: 0.0183 memory: 6645 grad_norm: 4.6524 loss: 0.6560 decode.loss_ce: 0.4417 decode.acc_seg: 81.5223 aux.loss_ce: 0.2143 aux.acc_seg: 81.7597 2024/10/25 20:06:36 - mmengine - INFO - Iter(train) [41800/80000] base_lr: 1.1940e-04 lr: 1.1940e-04 eta: 20:46:06 time: 1.9439 data_time: 0.0185 memory: 6645 grad_norm: 5.5099 loss: 0.5802 decode.loss_ce: 0.3824 decode.acc_seg: 91.4702 aux.loss_ce: 0.1979 aux.acc_seg: 82.3095 2024/10/25 20:08:14 - mmengine - INFO - Iter(train) [41850/80000] base_lr: 1.1937e-04 lr: 1.1937e-04 eta: 20:44:27 time: 1.9445 data_time: 0.0175 memory: 6645 grad_norm: 3.5605 loss: 0.6055 decode.loss_ce: 0.4023 decode.acc_seg: 86.2930 aux.loss_ce: 0.2032 aux.acc_seg: 79.3893 2024/10/25 20:09:51 - mmengine - INFO - Iter(train) [41900/80000] base_lr: 1.1933e-04 lr: 1.1933e-04 eta: 20:42:49 time: 1.9484 data_time: 0.0179 memory: 6645 grad_norm: 5.0663 loss: 0.5922 decode.loss_ce: 0.3988 decode.acc_seg: 85.4293 aux.loss_ce: 0.1934 aux.acc_seg: 82.0233 2024/10/25 20:11:29 - mmengine - INFO - Iter(train) [41950/80000] base_lr: 1.1930e-04 lr: 1.1930e-04 eta: 20:41:12 time: 1.9543 data_time: 0.0185 memory: 6645 grad_norm: 5.6711 loss: 0.7170 decode.loss_ce: 0.4830 decode.acc_seg: 81.1277 aux.loss_ce: 0.2340 aux.acc_seg: 80.5610 2024/10/25 20:13:07 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 20:13:07 - mmengine - INFO - Iter(train) [42000/80000] base_lr: 1.1926e-04 lr: 
1.1926e-04 eta: 20:39:34 time: 1.9579 data_time: 0.0182 memory: 6646 grad_norm: 5.1220 loss: 0.6623 decode.loss_ce: 0.4390 decode.acc_seg: 81.6934 aux.loss_ce: 0.2233 aux.acc_seg: 79.9681 2024/10/25 20:14:45 - mmengine - INFO - Iter(train) [42050/80000] base_lr: 1.1922e-04 lr: 1.1922e-04 eta: 20:37:56 time: 1.9442 data_time: 0.0194 memory: 6646 grad_norm: 7.3045 loss: 0.5862 decode.loss_ce: 0.3840 decode.acc_seg: 79.6234 aux.loss_ce: 0.2021 aux.acc_seg: 75.4701 2024/10/25 20:16:22 - mmengine - INFO - Iter(train) [42100/80000] base_lr: 1.1919e-04 lr: 1.1919e-04 eta: 20:36:17 time: 1.9494 data_time: 0.0195 memory: 6645 grad_norm: 6.8135 loss: 0.7166 decode.loss_ce: 0.4810 decode.acc_seg: 81.6947 aux.loss_ce: 0.2356 aux.acc_seg: 81.5387 2024/10/25 20:18:00 - mmengine - INFO - Iter(train) [42150/80000] base_lr: 1.1915e-04 lr: 1.1915e-04 eta: 20:34:39 time: 1.9743 data_time: 0.0182 memory: 6648 grad_norm: 5.0254 loss: 0.6321 decode.loss_ce: 0.4312 decode.acc_seg: 80.3778 aux.loss_ce: 0.2009 aux.acc_seg: 81.1937 2024/10/25 20:19:37 - mmengine - INFO - Iter(train) [42200/80000] base_lr: 1.1911e-04 lr: 1.1911e-04 eta: 20:33:01 time: 1.9511 data_time: 0.0188 memory: 6647 grad_norm: 5.6036 loss: 0.6206 decode.loss_ce: 0.4164 decode.acc_seg: 85.9934 aux.loss_ce: 0.2042 aux.acc_seg: 85.8739 2024/10/25 20:21:15 - mmengine - INFO - Iter(train) [42250/80000] base_lr: 1.1907e-04 lr: 1.1907e-04 eta: 20:31:23 time: 1.9651 data_time: 0.0182 memory: 6646 grad_norm: 6.4735 loss: 0.6940 decode.loss_ce: 0.4563 decode.acc_seg: 84.5274 aux.loss_ce: 0.2377 aux.acc_seg: 78.9091 2024/10/25 20:22:53 - mmengine - INFO - Iter(train) [42300/80000] base_lr: 1.1902e-04 lr: 1.1902e-04 eta: 20:29:46 time: 1.9520 data_time: 0.0191 memory: 6646 grad_norm: 6.0652 loss: 0.6291 decode.loss_ce: 0.4263 decode.acc_seg: 84.7505 aux.loss_ce: 0.2028 aux.acc_seg: 87.6133 2024/10/25 20:24:31 - mmengine - INFO - Iter(train) [42350/80000] base_lr: 1.1898e-04 lr: 1.1898e-04 eta: 20:28:08 time: 1.9488 data_time: 
0.0177 memory: 6646 grad_norm: 5.2286 loss: 0.5640 decode.loss_ce: 0.3762 decode.acc_seg: 83.0078 aux.loss_ce: 0.1878 aux.acc_seg: 83.4303 2024/10/25 20:26:09 - mmengine - INFO - Iter(train) [42400/80000] base_lr: 1.1894e-04 lr: 1.1894e-04 eta: 20:26:30 time: 1.9487 data_time: 0.0180 memory: 6647 grad_norm: 4.0655 loss: 0.6187 decode.loss_ce: 0.4162 decode.acc_seg: 84.8245 aux.loss_ce: 0.2025 aux.acc_seg: 83.0101 2024/10/25 20:27:47 - mmengine - INFO - Iter(train) [42450/80000] base_lr: 1.1889e-04 lr: 1.1889e-04 eta: 20:24:52 time: 1.9543 data_time: 0.0183 memory: 6647 grad_norm: 4.5993 loss: 0.6534 decode.loss_ce: 0.4308 decode.acc_seg: 81.9223 aux.loss_ce: 0.2226 aux.acc_seg: 80.0063 2024/10/25 20:29:24 - mmengine - INFO - Iter(train) [42500/80000] base_lr: 1.1885e-04 lr: 1.1885e-04 eta: 20:23:13 time: 1.9508 data_time: 0.0191 memory: 6646 grad_norm: 5.3295 loss: 0.5695 decode.loss_ce: 0.3749 decode.acc_seg: 82.7740 aux.loss_ce: 0.1946 aux.acc_seg: 82.4365 2024/10/25 20:31:01 - mmengine - INFO - Iter(train) [42550/80000] base_lr: 1.1880e-04 lr: 1.1880e-04 eta: 20:21:35 time: 1.9422 data_time: 0.0185 memory: 6645 grad_norm: 6.5380 loss: 0.6657 decode.loss_ce: 0.4466 decode.acc_seg: 80.3167 aux.loss_ce: 0.2190 aux.acc_seg: 76.7911 2024/10/25 20:32:42 - mmengine - INFO - Iter(train) [42600/80000] base_lr: 1.1875e-04 lr: 1.1875e-04 eta: 20:20:00 time: 1.9548 data_time: 0.0182 memory: 6645 grad_norm: 4.6784 loss: 0.5634 decode.loss_ce: 0.3877 decode.acc_seg: 71.0664 aux.loss_ce: 0.1757 aux.acc_seg: 67.2923 2024/10/25 20:34:20 - mmengine - INFO - Iter(train) [42650/80000] base_lr: 1.1871e-04 lr: 1.1871e-04 eta: 20:18:22 time: 1.9651 data_time: 0.0173 memory: 6645 grad_norm: 5.3089 loss: 0.5953 decode.loss_ce: 0.3943 decode.acc_seg: 81.5628 aux.loss_ce: 0.2011 aux.acc_seg: 80.4690 2024/10/25 20:35:58 - mmengine - INFO - Iter(train) [42700/80000] base_lr: 1.1866e-04 lr: 1.1866e-04 eta: 20:16:44 time: 1.9554 data_time: 0.0196 memory: 6645 grad_norm: 8.0127 loss: 0.6349 
decode.loss_ce: 0.4261 decode.acc_seg: 78.0464 aux.loss_ce: 0.2088 aux.acc_seg: 78.7705 2024/10/25 20:37:35 - mmengine - INFO - Iter(train) [42750/80000] base_lr: 1.1861e-04 lr: 1.1861e-04 eta: 20:15:06 time: 1.9538 data_time: 0.0181 memory: 6646 grad_norm: 4.9029 loss: 0.6209 decode.loss_ce: 0.4217 decode.acc_seg: 82.6261 aux.loss_ce: 0.1993 aux.acc_seg: 80.4570 2024/10/25 20:39:13 - mmengine - INFO - Iter(train) [42800/80000] base_lr: 1.1856e-04 lr: 1.1856e-04 eta: 20:13:28 time: 1.9507 data_time: 0.0201 memory: 6645 grad_norm: 6.6824 loss: 0.6969 decode.loss_ce: 0.4522 decode.acc_seg: 84.3974 aux.loss_ce: 0.2447 aux.acc_seg: 81.3723 2024/10/25 20:40:50 - mmengine - INFO - Iter(train) [42850/80000] base_lr: 1.1850e-04 lr: 1.1850e-04 eta: 20:11:50 time: 1.9475 data_time: 0.0181 memory: 6645 grad_norm: 4.5160 loss: 0.6145 decode.loss_ce: 0.4012 decode.acc_seg: 91.7608 aux.loss_ce: 0.2133 aux.acc_seg: 88.6375 2024/10/25 20:42:28 - mmengine - INFO - Iter(train) [42900/80000] base_lr: 1.1845e-04 lr: 1.1845e-04 eta: 20:10:12 time: 1.9521 data_time: 0.0175 memory: 6645 grad_norm: 8.3895 loss: 0.7803 decode.loss_ce: 0.5244 decode.acc_seg: 78.5133 aux.loss_ce: 0.2560 aux.acc_seg: 75.6242 2024/10/25 20:44:06 - mmengine - INFO - Iter(train) [42950/80000] base_lr: 1.1840e-04 lr: 1.1840e-04 eta: 20:08:34 time: 1.9447 data_time: 0.0187 memory: 6646 grad_norm: 6.0636 loss: 0.6705 decode.loss_ce: 0.4452 decode.acc_seg: 87.1344 aux.loss_ce: 0.2252 aux.acc_seg: 84.5279 2024/10/25 20:45:44 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 20:45:44 - mmengine - INFO - Iter(train) [43000/80000] base_lr: 1.1834e-04 lr: 1.1834e-04 eta: 20:06:56 time: 1.9502 data_time: 0.0183 memory: 6646 grad_norm: 4.5459 loss: 0.6775 decode.loss_ce: 0.4485 decode.acc_seg: 87.2014 aux.loss_ce: 0.2290 aux.acc_seg: 86.5725 2024/10/25 20:47:22 - mmengine - INFO - Iter(train) [43050/80000] base_lr: 1.1829e-04 lr: 1.1829e-04 eta: 20:05:18 time: 1.9477 
data_time: 0.0185 memory: 6646 grad_norm: 6.0730 loss: 0.6126 decode.loss_ce: 0.4164 decode.acc_seg: 81.7486 aux.loss_ce: 0.1962 aux.acc_seg: 75.6473 2024/10/25 20:48:59 - mmengine - INFO - Iter(train) [43100/80000] base_lr: 1.1823e-04 lr: 1.1823e-04 eta: 20:03:40 time: 1.9419 data_time: 0.0192 memory: 6646 grad_norm: 4.5987 loss: 0.6014 decode.loss_ce: 0.3974 decode.acc_seg: 89.2835 aux.loss_ce: 0.2040 aux.acc_seg: 78.9381 2024/10/25 20:50:37 - mmengine - INFO - Iter(train) [43150/80000] base_lr: 1.1817e-04 lr: 1.1817e-04 eta: 20:02:02 time: 1.9544 data_time: 0.0177 memory: 6645 grad_norm: 7.5701 loss: 0.6296 decode.loss_ce: 0.4151 decode.acc_seg: 85.2740 aux.loss_ce: 0.2145 aux.acc_seg: 81.8690 2024/10/25 20:52:14 - mmengine - INFO - Iter(train) [43200/80000] base_lr: 1.1812e-04 lr: 1.1812e-04 eta: 20:00:24 time: 1.9450 data_time: 0.0180 memory: 6645 grad_norm: 4.7149 loss: 0.6164 decode.loss_ce: 0.4084 decode.acc_seg: 89.3288 aux.loss_ce: 0.2081 aux.acc_seg: 84.3514 2024/10/25 20:53:52 - mmengine - INFO - Iter(train) [43250/80000] base_lr: 1.1806e-04 lr: 1.1806e-04 eta: 19:58:46 time: 1.9645 data_time: 0.0165 memory: 6645 grad_norm: 5.2235 loss: 0.6875 decode.loss_ce: 0.4523 decode.acc_seg: 83.5731 aux.loss_ce: 0.2352 aux.acc_seg: 81.6492 2024/10/25 20:55:30 - mmengine - INFO - Iter(train) [43300/80000] base_lr: 1.1800e-04 lr: 1.1800e-04 eta: 19:57:08 time: 1.9534 data_time: 0.0172 memory: 6645 grad_norm: 5.6884 loss: 0.5566 decode.loss_ce: 0.3806 decode.acc_seg: 86.8054 aux.loss_ce: 0.1760 aux.acc_seg: 87.0519 2024/10/25 20:57:07 - mmengine - INFO - Iter(train) [43350/80000] base_lr: 1.1794e-04 lr: 1.1794e-04 eta: 19:55:29 time: 1.9488 data_time: 0.0180 memory: 6647 grad_norm: 5.6187 loss: 0.5518 decode.loss_ce: 0.3655 decode.acc_seg: 85.4592 aux.loss_ce: 0.1862 aux.acc_seg: 82.6335 2024/10/25 20:58:45 - mmengine - INFO - Iter(train) [43400/80000] base_lr: 1.1787e-04 lr: 1.1787e-04 eta: 19:53:52 time: 1.9484 data_time: 0.0183 memory: 6645 grad_norm: 5.4896 
loss: 0.6684 decode.loss_ce: 0.4406 decode.acc_seg: 83.2234 aux.loss_ce: 0.2278 aux.acc_seg: 82.7491 2024/10/25 21:00:23 - mmengine - INFO - Iter(train) [43450/80000] base_lr: 1.1781e-04 lr: 1.1781e-04 eta: 19:52:13 time: 1.9516 data_time: 0.0187 memory: 6646 grad_norm: 8.0065 loss: 0.6652 decode.loss_ce: 0.4391 decode.acc_seg: 84.3550 aux.loss_ce: 0.2261 aux.acc_seg: 85.5409 2024/10/25 21:02:00 - mmengine - INFO - Iter(train) [43500/80000] base_lr: 1.1775e-04 lr: 1.1775e-04 eta: 19:50:35 time: 1.9451 data_time: 0.0190 memory: 6645 grad_norm: 5.7967 loss: 0.6410 decode.loss_ce: 0.4363 decode.acc_seg: 80.1163 aux.loss_ce: 0.2046 aux.acc_seg: 79.8244 2024/10/25 21:03:38 - mmengine - INFO - Iter(train) [43550/80000] base_lr: 1.1768e-04 lr: 1.1768e-04 eta: 19:48:58 time: 1.9559 data_time: 0.0187 memory: 6645 grad_norm: 5.1806 loss: 0.5973 decode.loss_ce: 0.3974 decode.acc_seg: 84.1043 aux.loss_ce: 0.1999 aux.acc_seg: 83.3905 2024/10/25 21:05:16 - mmengine - INFO - Iter(train) [43600/80000] base_lr: 1.1762e-04 lr: 1.1762e-04 eta: 19:47:19 time: 1.9486 data_time: 0.0185 memory: 6645 grad_norm: 4.6071 loss: 0.7318 decode.loss_ce: 0.4812 decode.acc_seg: 81.2989 aux.loss_ce: 0.2506 aux.acc_seg: 77.5988 2024/10/25 21:06:54 - mmengine - INFO - Iter(train) [43650/80000] base_lr: 1.1755e-04 lr: 1.1755e-04 eta: 19:45:42 time: 1.9699 data_time: 0.0184 memory: 6645 grad_norm: 6.2299 loss: 0.6292 decode.loss_ce: 0.4159 decode.acc_seg: 87.6566 aux.loss_ce: 0.2133 aux.acc_seg: 87.5069 2024/10/25 21:08:32 - mmengine - INFO - Iter(train) [43700/80000] base_lr: 1.1749e-04 lr: 1.1749e-04 eta: 19:44:04 time: 1.9488 data_time: 0.0182 memory: 6648 grad_norm: 4.9708 loss: 0.5564 decode.loss_ce: 0.3785 decode.acc_seg: 88.4001 aux.loss_ce: 0.1779 aux.acc_seg: 80.1396 2024/10/25 21:10:09 - mmengine - INFO - Iter(train) [43750/80000] base_lr: 1.1742e-04 lr: 1.1742e-04 eta: 19:42:26 time: 1.9480 data_time: 0.0185 memory: 6647 grad_norm: 4.5446 loss: 0.5569 decode.loss_ce: 0.3818 decode.acc_seg: 
91.4643 aux.loss_ce: 0.1751 aux.acc_seg: 91.8953 2024/10/25 21:11:47 - mmengine - INFO - Iter(train) [43800/80000] base_lr: 1.1735e-04 lr: 1.1735e-04 eta: 19:40:48 time: 1.9499 data_time: 0.0198 memory: 6647 grad_norm: 6.0952 loss: 0.6682 decode.loss_ce: 0.4285 decode.acc_seg: 81.0871 aux.loss_ce: 0.2397 aux.acc_seg: 73.2534 2024/10/25 21:13:25 - mmengine - INFO - Iter(train) [43850/80000] base_lr: 1.1728e-04 lr: 1.1728e-04 eta: 19:39:10 time: 1.9490 data_time: 0.0184 memory: 6645 grad_norm: 5.6174 loss: 0.7224 decode.loss_ce: 0.4904 decode.acc_seg: 79.6150 aux.loss_ce: 0.2320 aux.acc_seg: 81.4860 2024/10/25 21:15:02 - mmengine - INFO - Iter(train) [43900/80000] base_lr: 1.1721e-04 lr: 1.1721e-04 eta: 19:37:32 time: 1.9484 data_time: 0.0186 memory: 6646 grad_norm: 6.2528 loss: 0.6385 decode.loss_ce: 0.4237 decode.acc_seg: 89.5688 aux.loss_ce: 0.2148 aux.acc_seg: 85.7920 2024/10/25 21:16:43 - mmengine - INFO - Iter(train) [43950/80000] base_lr: 1.1714e-04 lr: 1.1714e-04 eta: 19:35:56 time: 1.9485 data_time: 0.0188 memory: 6646 grad_norm: 4.2974 loss: 0.5758 decode.loss_ce: 0.3848 decode.acc_seg: 80.6734 aux.loss_ce: 0.1910 aux.acc_seg: 79.9195 2024/10/25 21:18:20 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 21:18:20 - mmengine - INFO - Iter(train) [44000/80000] base_lr: 1.1706e-04 lr: 1.1706e-04 eta: 19:34:18 time: 1.9564 data_time: 0.0192 memory: 6646 grad_norm: 6.7669 loss: 0.6670 decode.loss_ce: 0.4483 decode.acc_seg: 87.4658 aux.loss_ce: 0.2186 aux.acc_seg: 84.8966 2024/10/25 21:19:58 - mmengine - INFO - Iter(train) [44050/80000] base_lr: 1.1699e-04 lr: 1.1699e-04 eta: 19:32:40 time: 1.9493 data_time: 0.0188 memory: 6647 grad_norm: 5.4224 loss: 0.6254 decode.loss_ce: 0.4150 decode.acc_seg: 78.7871 aux.loss_ce: 0.2105 aux.acc_seg: 77.6844 2024/10/25 21:21:36 - mmengine - INFO - Iter(train) [44100/80000] base_lr: 1.1692e-04 lr: 1.1692e-04 eta: 19:31:02 time: 1.9514 data_time: 0.0185 memory: 6645 grad_norm: 
7.9556 loss: 0.5907 decode.loss_ce: 0.4006 decode.acc_seg: 84.1663 aux.loss_ce: 0.1902 aux.acc_seg: 79.7531 2024/10/25 21:23:14 - mmengine - INFO - Iter(train) [44150/80000] base_lr: 1.1684e-04 lr: 1.1684e-04 eta: 19:29:24 time: 1.9489 data_time: 0.0196 memory: 6645 grad_norm: 4.0493 loss: 0.5765 decode.loss_ce: 0.3889 decode.acc_seg: 85.7976 aux.loss_ce: 0.1876 aux.acc_seg: 82.7493 2024/10/25 21:24:51 - mmengine - INFO - Iter(train) [44200/80000] base_lr: 1.1677e-04 lr: 1.1677e-04 eta: 19:27:46 time: 1.9504 data_time: 0.0186 memory: 6647 grad_norm: 6.1354 loss: 0.5624 decode.loss_ce: 0.3796 decode.acc_seg: 86.6978 aux.loss_ce: 0.1828 aux.acc_seg: 84.5677 2024/10/25 21:26:29 - mmengine - INFO - Iter(train) [44250/80000] base_lr: 1.1669e-04 lr: 1.1669e-04 eta: 19:26:08 time: 1.9657 data_time: 0.0189 memory: 6646 grad_norm: 6.8834 loss: 0.5739 decode.loss_ce: 0.3736 decode.acc_seg: 77.1888 aux.loss_ce: 0.2003 aux.acc_seg: 75.8700 2024/10/25 21:28:07 - mmengine - INFO - Iter(train) [44300/80000] base_lr: 1.1661e-04 lr: 1.1661e-04 eta: 19:24:30 time: 1.9535 data_time: 0.0200 memory: 6646 grad_norm: 7.1700 loss: 0.6146 decode.loss_ce: 0.3943 decode.acc_seg: 86.6859 aux.loss_ce: 0.2203 aux.acc_seg: 85.5700 2024/10/25 21:29:45 - mmengine - INFO - Iter(train) [44350/80000] base_lr: 1.1653e-04 lr: 1.1653e-04 eta: 19:22:52 time: 1.9520 data_time: 0.0177 memory: 6646 grad_norm: 7.1828 loss: 0.6088 decode.loss_ce: 0.4109 decode.acc_seg: 85.5460 aux.loss_ce: 0.1979 aux.acc_seg: 86.0667 2024/10/25 21:31:23 - mmengine - INFO - Iter(train) [44400/80000] base_lr: 1.1645e-04 lr: 1.1645e-04 eta: 19:21:15 time: 1.9659 data_time: 0.0192 memory: 6645 grad_norm: 5.1069 loss: 0.6553 decode.loss_ce: 0.4331 decode.acc_seg: 85.8589 aux.loss_ce: 0.2222 aux.acc_seg: 79.9871 2024/10/25 21:33:00 - mmengine - INFO - Iter(train) [44450/80000] base_lr: 1.1637e-04 lr: 1.1637e-04 eta: 19:19:36 time: 1.9473 data_time: 0.0191 memory: 6645 grad_norm: 5.7771 loss: 0.5861 decode.loss_ce: 0.3915 
decode.acc_seg: 86.3446 aux.loss_ce: 0.1946 aux.acc_seg: 87.9060 2024/10/25 21:34:43 - mmengine - INFO - Iter(train) [44500/80000] base_lr: 1.1629e-04 lr: 1.1629e-04 eta: 19:18:02 time: 1.9454 data_time: 0.0185 memory: 6646 grad_norm: 4.2887 loss: 0.5563 decode.loss_ce: 0.3671 decode.acc_seg: 84.5972 aux.loss_ce: 0.1893 aux.acc_seg: 69.1399 2024/10/25 21:36:20 - mmengine - INFO - Iter(train) [44550/80000] base_lr: 1.1621e-04 lr: 1.1621e-04 eta: 19:16:24 time: 1.9555 data_time: 0.0181 memory: 6645 grad_norm: 5.4550 loss: 0.5699 decode.loss_ce: 0.3950 decode.acc_seg: 89.3656 aux.loss_ce: 0.1749 aux.acc_seg: 86.8186 2024/10/25 21:37:58 - mmengine - INFO - Iter(train) [44600/80000] base_lr: 1.1613e-04 lr: 1.1613e-04 eta: 19:14:46 time: 1.9583 data_time: 0.0179 memory: 6646 grad_norm: 7.0317 loss: 0.6197 decode.loss_ce: 0.4156 decode.acc_seg: 86.1491 aux.loss_ce: 0.2041 aux.acc_seg: 85.4290 2024/10/25 21:39:36 - mmengine - INFO - Iter(train) [44650/80000] base_lr: 1.1604e-04 lr: 1.1604e-04 eta: 19:13:08 time: 1.9508 data_time: 0.0195 memory: 6645 grad_norm: 5.7995 loss: 0.5307 decode.loss_ce: 0.3377 decode.acc_seg: 84.6578 aux.loss_ce: 0.1930 aux.acc_seg: 78.3325 2024/10/25 21:41:13 - mmengine - INFO - Iter(train) [44700/80000] base_lr: 1.1596e-04 lr: 1.1596e-04 eta: 19:11:30 time: 1.9593 data_time: 0.0193 memory: 6645 grad_norm: 5.7039 loss: 0.5723 decode.loss_ce: 0.3829 decode.acc_seg: 86.6017 aux.loss_ce: 0.1894 aux.acc_seg: 85.9214 2024/10/25 21:42:51 - mmengine - INFO - Iter(train) [44750/80000] base_lr: 1.1587e-04 lr: 1.1587e-04 eta: 19:09:52 time: 1.9493 data_time: 0.0188 memory: 6646 grad_norm: 7.0360 loss: 0.6693 decode.loss_ce: 0.4341 decode.acc_seg: 85.3919 aux.loss_ce: 0.2352 aux.acc_seg: 86.3521 2024/10/25 21:44:28 - mmengine - INFO - Iter(train) [44800/80000] base_lr: 1.1579e-04 lr: 1.1579e-04 eta: 19:08:14 time: 1.9552 data_time: 0.0183 memory: 6645 grad_norm: 7.7218 loss: 0.5798 decode.loss_ce: 0.3992 decode.acc_seg: 70.5336 aux.loss_ce: 0.1806 
aux.acc_seg: 70.1799 2024/10/25 21:46:06 - mmengine - INFO - Iter(train) [44850/80000] base_lr: 1.1570e-04 lr: 1.1570e-04 eta: 19:06:36 time: 1.9573 data_time: 0.0182 memory: 6645 grad_norm: 5.8126 loss: 0.6677 decode.loss_ce: 0.4462 decode.acc_seg: 78.8744 aux.loss_ce: 0.2215 aux.acc_seg: 79.5766 2024/10/25 21:47:44 - mmengine - INFO - Iter(train) [44900/80000] base_lr: 1.1561e-04 lr: 1.1561e-04 eta: 19:04:58 time: 1.9553 data_time: 0.0173 memory: 6646 grad_norm: 4.3045 loss: 0.6597 decode.loss_ce: 0.4285 decode.acc_seg: 84.8621 aux.loss_ce: 0.2312 aux.acc_seg: 83.7973 2024/10/25 21:49:22 - mmengine - INFO - Iter(train) [44950/80000] base_lr: 1.1552e-04 lr: 1.1552e-04 eta: 19:03:20 time: 1.9527 data_time: 0.0185 memory: 6645 grad_norm: 5.2102 loss: 0.5404 decode.loss_ce: 0.3591 decode.acc_seg: 85.9060 aux.loss_ce: 0.1813 aux.acc_seg: 78.5042 2024/10/25 21:51:00 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 21:51:00 - mmengine - INFO - Iter(train) [45000/80000] base_lr: 1.1543e-04 lr: 1.1543e-04 eta: 19:01:42 time: 1.9995 data_time: 0.0183 memory: 6647 grad_norm: 4.3718 loss: 0.5470 decode.loss_ce: 0.3647 decode.acc_seg: 77.6702 aux.loss_ce: 0.1823 aux.acc_seg: 73.6673 2024/10/25 21:52:38 - mmengine - INFO - Iter(train) [45050/80000] base_lr: 1.1534e-04 lr: 1.1534e-04 eta: 19:00:04 time: 1.9478 data_time: 0.0189 memory: 6646 grad_norm: 4.1209 loss: 0.5971 decode.loss_ce: 0.4134 decode.acc_seg: 81.8315 aux.loss_ce: 0.1837 aux.acc_seg: 82.2314 2024/10/25 21:54:15 - mmengine - INFO - Iter(train) [45100/80000] base_lr: 1.1525e-04 lr: 1.1525e-04 eta: 18:58:26 time: 1.9501 data_time: 0.0185 memory: 6646 grad_norm: 4.5204 loss: 0.5673 decode.loss_ce: 0.3807 decode.acc_seg: 84.2745 aux.loss_ce: 0.1866 aux.acc_seg: 83.9258 2024/10/25 21:55:53 - mmengine - INFO - Iter(train) [45150/80000] base_lr: 1.1516e-04 lr: 1.1516e-04 eta: 18:56:48 time: 1.9506 data_time: 0.0188 memory: 6646 grad_norm: 5.0210 loss: 0.6312 
decode.loss_ce: 0.4204 decode.acc_seg: 89.7845 aux.loss_ce: 0.2108 aux.acc_seg: 87.2339 2024/10/25 21:57:30 - mmengine - INFO - Iter(train) [45200/80000] base_lr: 1.1507e-04 lr: 1.1507e-04 eta: 18:55:10 time: 1.9452 data_time: 0.0194 memory: 6647 grad_norm: 5.5638 loss: 0.4870 decode.loss_ce: 0.3268 decode.acc_seg: 90.4070 aux.loss_ce: 0.1601 aux.acc_seg: 88.2517 2024/10/25 21:59:08 - mmengine - INFO - Iter(train) [45250/80000] base_lr: 1.1497e-04 lr: 1.1497e-04 eta: 18:53:32 time: 1.9462 data_time: 0.0191 memory: 6645 grad_norm: 4.8526 loss: 0.6521 decode.loss_ce: 0.4303 decode.acc_seg: 78.3803 aux.loss_ce: 0.2218 aux.acc_seg: 78.1445 2024/10/25 22:00:46 - mmengine - INFO - Iter(train) [45300/80000] base_lr: 1.1488e-04 lr: 1.1488e-04 eta: 18:51:54 time: 1.9523 data_time: 0.0199 memory: 6646 grad_norm: 5.2437 loss: 0.6921 decode.loss_ce: 0.4548 decode.acc_seg: 85.9350 aux.loss_ce: 0.2373 aux.acc_seg: 82.3419 2024/10/25 22:02:24 - mmengine - INFO - Iter(train) [45350/80000] base_lr: 1.1478e-04 lr: 1.1478e-04 eta: 18:50:16 time: 1.9475 data_time: 0.0177 memory: 6645 grad_norm: 5.6151 loss: 0.6561 decode.loss_ce: 0.4309 decode.acc_seg: 79.0634 aux.loss_ce: 0.2252 aux.acc_seg: 74.5013 2024/10/25 22:04:02 - mmengine - INFO - Iter(train) [45400/80000] base_lr: 1.1469e-04 lr: 1.1469e-04 eta: 18:48:39 time: 1.9559 data_time: 0.0200 memory: 6645 grad_norm: 4.5333 loss: 0.6029 decode.loss_ce: 0.3952 decode.acc_seg: 87.4505 aux.loss_ce: 0.2076 aux.acc_seg: 82.3035 2024/10/25 22:05:43 - mmengine - INFO - Iter(train) [45450/80000] base_lr: 1.1459e-04 lr: 1.1459e-04 eta: 18:47:04 time: 1.9383 data_time: 0.0179 memory: 6645 grad_norm: 5.5935 loss: 0.6949 decode.loss_ce: 0.4370 decode.acc_seg: 86.4813 aux.loss_ce: 0.2579 aux.acc_seg: 71.1596 2024/10/25 22:07:21 - mmengine - INFO - Iter(train) [45500/80000] base_lr: 1.1449e-04 lr: 1.1449e-04 eta: 18:45:26 time: 1.9505 data_time: 0.0191 memory: 6647 grad_norm: 4.6897 loss: 0.5213 decode.loss_ce: 0.3450 decode.acc_seg: 86.6768 
aux.loss_ce: 0.1763 aux.acc_seg: 85.3311 2024/10/25 22:08:59 - mmengine - INFO - Iter(train) [45550/80000] base_lr: 1.1439e-04 lr: 1.1439e-04 eta: 18:43:48 time: 1.9604 data_time: 0.0185 memory: 6646 grad_norm: 4.7716 loss: 0.6340 decode.loss_ce: 0.4238 decode.acc_seg: 74.6583 aux.loss_ce: 0.2102 aux.acc_seg: 72.9766 2024/10/25 22:10:36 - mmengine - INFO - Iter(train) [45600/80000] base_lr: 1.1429e-04 lr: 1.1429e-04 eta: 18:42:10 time: 1.9757 data_time: 0.0183 memory: 6645 grad_norm: 5.8286 loss: 0.5108 decode.loss_ce: 0.3445 decode.acc_seg: 83.2981 aux.loss_ce: 0.1663 aux.acc_seg: 81.8932 2024/10/25 22:12:14 - mmengine - INFO - Iter(train) [45650/80000] base_lr: 1.1419e-04 lr: 1.1419e-04 eta: 18:40:32 time: 1.9525 data_time: 0.0184 memory: 6645 grad_norm: 5.4201 loss: 0.5240 decode.loss_ce: 0.3451 decode.acc_seg: 88.6066 aux.loss_ce: 0.1789 aux.acc_seg: 83.0849 2024/10/25 22:13:52 - mmengine - INFO - Iter(train) [45700/80000] base_lr: 1.1409e-04 lr: 1.1409e-04 eta: 18:38:54 time: 1.9500 data_time: 0.0180 memory: 6646 grad_norm: 5.2885 loss: 0.7251 decode.loss_ce: 0.4845 decode.acc_seg: 82.3583 aux.loss_ce: 0.2406 aux.acc_seg: 80.3211 2024/10/25 22:15:31 - mmengine - INFO - Iter(train) [45750/80000] base_lr: 1.1399e-04 lr: 1.1399e-04 eta: 18:37:16 time: 1.9550 data_time: 0.0188 memory: 6645 grad_norm: 4.4244 loss: 0.5375 decode.loss_ce: 0.3578 decode.acc_seg: 80.6349 aux.loss_ce: 0.1797 aux.acc_seg: 76.1188 2024/10/25 22:17:08 - mmengine - INFO - Iter(train) [45800/80000] base_lr: 1.1388e-04 lr: 1.1388e-04 eta: 18:35:38 time: 1.9522 data_time: 0.0187 memory: 6645 grad_norm: 4.1448 loss: 0.5916 decode.loss_ce: 0.3949 decode.acc_seg: 80.4244 aux.loss_ce: 0.1967 aux.acc_seg: 81.9568 2024/10/25 22:18:46 - mmengine - INFO - Iter(train) [45850/80000] base_lr: 1.1378e-04 lr: 1.1378e-04 eta: 18:34:00 time: 1.9523 data_time: 0.0186 memory: 6646 grad_norm: 6.1642 loss: 0.5875 decode.loss_ce: 0.3961 decode.acc_seg: 80.7124 aux.loss_ce: 0.1914 aux.acc_seg: 77.6844 2024/10/25 
22:20:24 - mmengine - INFO - Iter(train) [45900/80000] base_lr: 1.1367e-04 lr: 1.1367e-04 eta: 18:32:22 time: 1.9477 data_time: 0.0189 memory: 6646 grad_norm: 6.4187 loss: 0.6177 decode.loss_ce: 0.4127 decode.acc_seg: 83.9235 aux.loss_ce: 0.2051 aux.acc_seg: 77.8427 2024/10/25 22:22:01 - mmengine - INFO - Iter(train) [45950/80000] base_lr: 1.1357e-04 lr: 1.1357e-04 eta: 18:30:44 time: 1.9677 data_time: 0.0180 memory: 6645 grad_norm: 6.8066 loss: 0.5869 decode.loss_ce: 0.4075 decode.acc_seg: 78.8934 aux.loss_ce: 0.1794 aux.acc_seg: 80.4807 2024/10/25 22:23:43 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 22:23:43 - mmengine - INFO - Iter(train) [46000/80000] base_lr: 1.1346e-04 lr: 1.1346e-04 eta: 18:29:09 time: 1.9422 data_time: 0.0183 memory: 6645 grad_norm: 7.0151 loss: 0.6059 decode.loss_ce: 0.3982 decode.acc_seg: 86.6911 aux.loss_ce: 0.2077 aux.acc_seg: 79.8298 2024/10/25 22:25:20 - mmengine - INFO - Iter(train) [46050/80000] base_lr: 1.1336e-04 lr: 1.1336e-04 eta: 18:27:31 time: 1.9509 data_time: 0.0187 memory: 6645 grad_norm: 5.8459 loss: 0.6669 decode.loss_ce: 0.4399 decode.acc_seg: 90.4086 aux.loss_ce: 0.2269 aux.acc_seg: 89.3712 2024/10/25 22:26:58 - mmengine - INFO - Iter(train) [46100/80000] base_lr: 1.1325e-04 lr: 1.1325e-04 eta: 18:25:53 time: 1.9481 data_time: 0.0188 memory: 6646 grad_norm: 5.3879 loss: 0.5598 decode.loss_ce: 0.3700 decode.acc_seg: 87.8281 aux.loss_ce: 0.1899 aux.acc_seg: 76.3542 2024/10/25 22:28:36 - mmengine - INFO - Iter(train) [46150/80000] base_lr: 1.1314e-04 lr: 1.1314e-04 eta: 18:24:15 time: 1.9574 data_time: 0.0179 memory: 6646 grad_norm: 4.8485 loss: 0.5550 decode.loss_ce: 0.3699 decode.acc_seg: 82.8322 aux.loss_ce: 0.1851 aux.acc_seg: 80.2549 2024/10/25 22:30:13 - mmengine - INFO - Iter(train) [46200/80000] base_lr: 1.1303e-04 lr: 1.1303e-04 eta: 18:22:37 time: 1.9555 data_time: 0.0199 memory: 6645 grad_norm: 7.1398 loss: 0.5659 decode.loss_ce: 0.3806 decode.acc_seg: 
79.3061 aux.loss_ce: 0.1853 aux.acc_seg: 76.8370 2024/10/25 22:31:51 - mmengine - INFO - Iter(train) [46250/80000] base_lr: 1.1292e-04 lr: 1.1292e-04 eta: 18:20:59 time: 1.9511 data_time: 0.0195 memory: 6645 grad_norm: 4.2743 loss: 0.6400 decode.loss_ce: 0.4191 decode.acc_seg: 90.2608 aux.loss_ce: 0.2209 aux.acc_seg: 82.9053 2024/10/25 22:33:29 - mmengine - INFO - Iter(train) [46300/80000] base_lr: 1.1281e-04 lr: 1.1281e-04 eta: 18:19:21 time: 1.9540 data_time: 0.0206 memory: 6645 grad_norm: 4.4919 loss: 0.5970 decode.loss_ce: 0.3939 decode.acc_seg: 87.4702 aux.loss_ce: 0.2031 aux.acc_seg: 87.1937 2024/10/25 22:35:06 - mmengine - INFO - Iter(train) [46350/80000] base_lr: 1.1269e-04 lr: 1.1269e-04 eta: 18:17:43 time: 1.9496 data_time: 0.0184 memory: 6646 grad_norm: 4.5891 loss: 0.6471 decode.loss_ce: 0.4411 decode.acc_seg: 86.1202 aux.loss_ce: 0.2059 aux.acc_seg: 85.9331 2024/10/25 22:36:45 - mmengine - INFO - Iter(train) [46400/80000] base_lr: 1.1258e-04 lr: 1.1258e-04 eta: 18:16:05 time: 1.9471 data_time: 0.0190 memory: 6646 grad_norm: 5.4187 loss: 0.5972 decode.loss_ce: 0.3859 decode.acc_seg: 75.5673 aux.loss_ce: 0.2114 aux.acc_seg: 70.6721 2024/10/25 22:38:22 - mmengine - INFO - Iter(train) [46450/80000] base_lr: 1.1247e-04 lr: 1.1247e-04 eta: 18:14:27 time: 1.9502 data_time: 0.0190 memory: 6645 grad_norm: 6.5954 loss: 0.6163 decode.loss_ce: 0.4062 decode.acc_seg: 78.2891 aux.loss_ce: 0.2101 aux.acc_seg: 75.3188 2024/10/25 22:40:00 - mmengine - INFO - Iter(train) [46500/80000] base_lr: 1.1235e-04 lr: 1.1235e-04 eta: 18:12:49 time: 1.9536 data_time: 0.0201 memory: 6646 grad_norm: 3.9798 loss: 0.5992 decode.loss_ce: 0.4146 decode.acc_seg: 84.3246 aux.loss_ce: 0.1845 aux.acc_seg: 81.3876 2024/10/25 22:41:37 - mmengine - INFO - Iter(train) [46550/80000] base_lr: 1.1224e-04 lr: 1.1224e-04 eta: 18:11:11 time: 1.9455 data_time: 0.0191 memory: 6646 grad_norm: 4.3065 loss: 0.5898 decode.loss_ce: 0.4020 decode.acc_seg: 85.6444 aux.loss_ce: 0.1878 aux.acc_seg: 85.7945 
2024/10/25 22:43:15 - mmengine - INFO - Iter(train) [46600/80000] base_lr: 1.1212e-04 lr: 1.1212e-04 eta: 18:09:33 time: 1.9550 data_time: 0.0181 memory: 6645 grad_norm: 3.8580 loss: 0.6010 decode.loss_ce: 0.4022 decode.acc_seg: 76.9479 aux.loss_ce: 0.1988 aux.acc_seg: 77.0918 2024/10/25 22:44:52 - mmengine - INFO - Iter(train) [46650/80000] base_lr: 1.1200e-04 lr: 1.1200e-04 eta: 18:07:55 time: 1.9560 data_time: 0.0194 memory: 6645 grad_norm: 6.1573 loss: 0.6091 decode.loss_ce: 0.4080 decode.acc_seg: 82.6705 aux.loss_ce: 0.2011 aux.acc_seg: 79.6750 2024/10/25 22:46:30 - mmengine - INFO - Iter(train) [46700/80000] base_lr: 1.1189e-04 lr: 1.1189e-04 eta: 18:06:17 time: 1.9483 data_time: 0.0193 memory: 6645 grad_norm: 6.2426 loss: 0.6032 decode.loss_ce: 0.3977 decode.acc_seg: 80.8533 aux.loss_ce: 0.2055 aux.acc_seg: 82.6222 2024/10/25 22:48:07 - mmengine - INFO - Iter(train) [46750/80000] base_lr: 1.1177e-04 lr: 1.1177e-04 eta: 18:04:39 time: 1.9549 data_time: 0.0185 memory: 6645 grad_norm: 5.6302 loss: 0.7331 decode.loss_ce: 0.4820 decode.acc_seg: 86.9551 aux.loss_ce: 0.2511 aux.acc_seg: 85.3443 2024/10/25 22:49:45 - mmengine - INFO - Iter(train) [46800/80000] base_lr: 1.1165e-04 lr: 1.1165e-04 eta: 18:03:00 time: 1.9458 data_time: 0.0181 memory: 6645 grad_norm: 5.4420 loss: 0.6091 decode.loss_ce: 0.4064 decode.acc_seg: 78.2379 aux.loss_ce: 0.2027 aux.acc_seg: 72.4294 2024/10/25 22:51:22 - mmengine - INFO - Iter(train) [46850/80000] base_lr: 1.1153e-04 lr: 1.1153e-04 eta: 18:01:22 time: 1.9463 data_time: 0.0192 memory: 6644 grad_norm: 4.8892 loss: 0.6441 decode.loss_ce: 0.4331 decode.acc_seg: 84.2496 aux.loss_ce: 0.2110 aux.acc_seg: 75.5239 2024/10/25 22:53:00 - mmengine - INFO - Iter(train) [46900/80000] base_lr: 1.1141e-04 lr: 1.1141e-04 eta: 17:59:44 time: 1.9526 data_time: 0.0195 memory: 6646 grad_norm: 6.1791 loss: 0.7124 decode.loss_ce: 0.4597 decode.acc_seg: 76.0128 aux.loss_ce: 0.2526 aux.acc_seg: 75.2526 2024/10/25 22:54:38 - mmengine - INFO - Iter(train) 
[46950/80000] base_lr: 1.1128e-04 lr: 1.1128e-04 eta: 17:58:06 time: 1.9468 data_time: 0.0186 memory: 6644 grad_norm: 6.1956 loss: 0.6088 decode.loss_ce: 0.3957 decode.acc_seg: 87.7999 aux.loss_ce: 0.2131 aux.acc_seg: 86.7735 2024/10/25 22:56:16 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/25 22:56:16 - mmengine - INFO - Iter(train) [47000/80000] base_lr: 1.1116e-04 lr: 1.1116e-04 eta: 17:56:29 time: 1.9739 data_time: 0.0194 memory: 6646 grad_norm: 5.5805 loss: 0.5741 decode.loss_ce: 0.3847 decode.acc_seg: 80.8207 aux.loss_ce: 0.1894 aux.acc_seg: 80.6943 2024/10/25 22:57:54 - mmengine - INFO - Iter(train) [47050/80000] base_lr: 1.1104e-04 lr: 1.1104e-04 eta: 17:54:51 time: 1.9537 data_time: 0.0190 memory: 6646 grad_norm: 6.7241 loss: 0.5209 decode.loss_ce: 0.3421 decode.acc_seg: 87.5498 aux.loss_ce: 0.1788 aux.acc_seg: 87.0419 2024/10/25 22:59:31 - mmengine - INFO - Iter(train) [47100/80000] base_lr: 1.1091e-04 lr: 1.1091e-04 eta: 17:53:13 time: 1.9546 data_time: 0.0188 memory: 6645 grad_norm: 5.4687 loss: 0.5986 decode.loss_ce: 0.4144 decode.acc_seg: 81.9234 aux.loss_ce: 0.1842 aux.acc_seg: 77.4657 2024/10/25 23:01:09 - mmengine - INFO - Iter(train) [47150/80000] base_lr: 1.1079e-04 lr: 1.1079e-04 eta: 17:51:35 time: 1.9519 data_time: 0.0182 memory: 6646 grad_norm: 6.5740 loss: 0.5904 decode.loss_ce: 0.3894 decode.acc_seg: 88.5815 aux.loss_ce: 0.2010 aux.acc_seg: 87.4153 2024/10/25 23:02:47 - mmengine - INFO - Iter(train) [47200/80000] base_lr: 1.1066e-04 lr: 1.1066e-04 eta: 17:49:57 time: 1.9451 data_time: 0.0176 memory: 6645 grad_norm: 5.4491 loss: 0.6199 decode.loss_ce: 0.4099 decode.acc_seg: 85.9704 aux.loss_ce: 0.2100 aux.acc_seg: 86.6971 2024/10/25 23:04:24 - mmengine - INFO - Iter(train) [47250/80000] base_lr: 1.1054e-04 lr: 1.1054e-04 eta: 17:48:19 time: 1.9514 data_time: 0.0194 memory: 6646 grad_norm: 6.0585 loss: 0.5732 decode.loss_ce: 0.3744 decode.acc_seg: 86.4638 aux.loss_ce: 0.1987 aux.acc_seg: 
82.4996 2024/10/25 23:06:02 - mmengine - INFO - Iter(train) [47300/80000] base_lr: 1.1041e-04 lr: 1.1041e-04 eta: 17:46:41 time: 1.9514 data_time: 0.0185 memory: 6646 grad_norm: 6.5166 loss: 0.6676 decode.loss_ce: 0.4424 decode.acc_seg: 84.3760 aux.loss_ce: 0.2252 aux.acc_seg: 79.8994 2024/10/25 23:07:43 - mmengine - INFO - Iter(train) [47350/80000] base_lr: 1.1028e-04 lr: 1.1028e-04 eta: 17:45:05 time: 1.9539 data_time: 0.0191 memory: 6645 grad_norm: 5.7054 loss: 0.6212 decode.loss_ce: 0.4222 decode.acc_seg: 81.5847 aux.loss_ce: 0.1989 aux.acc_seg: 83.9110 2024/10/25 23:09:21 - mmengine - INFO - Iter(train) [47400/80000] base_lr: 1.1015e-04 lr: 1.1015e-04 eta: 17:43:27 time: 1.9432 data_time: 0.0192 memory: 6649 grad_norm: 4.5049 loss: 0.5469 decode.loss_ce: 0.3697 decode.acc_seg: 86.1710 aux.loss_ce: 0.1772 aux.acc_seg: 85.9552 2024/10/25 23:10:58 - mmengine - INFO - Iter(train) [47450/80000] base_lr: 1.1002e-04 lr: 1.1002e-04 eta: 17:41:49 time: 1.9470 data_time: 0.0189 memory: 6645 grad_norm: 4.5401 loss: 0.5517 decode.loss_ce: 0.3670 decode.acc_seg: 88.8441 aux.loss_ce: 0.1847 aux.acc_seg: 90.0230 2024/10/25 23:12:36 - mmengine - INFO - Iter(train) [47500/80000] base_lr: 1.0989e-04 lr: 1.0989e-04 eta: 17:40:11 time: 1.9674 data_time: 0.0198 memory: 6646 grad_norm: 5.0076 loss: 0.6441 decode.loss_ce: 0.4285 decode.acc_seg: 74.2534 aux.loss_ce: 0.2156 aux.acc_seg: 71.8061 2024/10/25 23:14:13 - mmengine - INFO - Iter(train) [47550/80000] base_lr: 1.0976e-04 lr: 1.0976e-04 eta: 17:38:32 time: 1.9499 data_time: 0.0182 memory: 6645 grad_norm: 5.1510 loss: 0.7117 decode.loss_ce: 0.4923 decode.acc_seg: 75.1630 aux.loss_ce: 0.2194 aux.acc_seg: 74.6390 2024/10/25 23:15:51 - mmengine - INFO - Iter(train) [47600/80000] base_lr: 1.0963e-04 lr: 1.0963e-04 eta: 17:36:55 time: 1.9601 data_time: 0.0201 memory: 6648 grad_norm: 5.4860 loss: 0.6171 decode.loss_ce: 0.3943 decode.acc_seg: 87.7658 aux.loss_ce: 0.2228 aux.acc_seg: 78.9685 2024/10/25 23:17:28 - mmengine - INFO - 
Iter(train) [47650/80000] base_lr: 1.0949e-04 lr: 1.0949e-04 eta: 17:35:16 time: 1.9529 data_time: 0.0203 memory: 6646 grad_norm: 5.7536 loss: 0.5110 decode.loss_ce: 0.3408 decode.acc_seg: 81.3331 aux.loss_ce: 0.1702 aux.acc_seg: 76.8483 2024/10/25 23:19:06 - mmengine - INFO - Iter(train) [47700/80000] base_lr: 1.0936e-04 lr: 1.0936e-04 eta: 17:33:38 time: 1.9544 data_time: 0.0192 memory: 6645 grad_norm: 5.7076 loss: 0.6127 decode.loss_ce: 0.4155 decode.acc_seg: 87.5002 aux.loss_ce: 0.1972 aux.acc_seg: 87.7042 2024/10/25 23:20:44 - mmengine - INFO - Iter(train) [47750/80000] base_lr: 1.0923e-04 lr: 1.0923e-04 eta: 17:32:01 time: 1.9487 data_time: 0.0183 memory: 6645 grad_norm: 5.1349 loss: 0.6032 decode.loss_ce: 0.4020 decode.acc_seg: 87.6270 aux.loss_ce: 0.2012 aux.acc_seg: 87.5664 2024/10/25 23:22:22 - mmengine - INFO - Iter(train) [47800/80000] base_lr: 1.0909e-04 lr: 1.0909e-04 eta: 17:30:23 time: 1.9416 data_time: 0.0182 memory: 6646 grad_norm: 4.9592 loss: 0.6300 decode.loss_ce: 0.4393 decode.acc_seg: 89.5111 aux.loss_ce: 0.1906 aux.acc_seg: 90.1218 2024/10/25 23:23:59 - mmengine - INFO - Iter(train) [47850/80000] base_lr: 1.0896e-04 lr: 1.0896e-04 eta: 17:28:45 time: 1.9537 data_time: 0.0195 memory: 6646 grad_norm: 7.0616 loss: 0.6624 decode.loss_ce: 0.4468 decode.acc_seg: 82.8548 aux.loss_ce: 0.2156 aux.acc_seg: 83.4993 2024/10/25 23:25:37 - mmengine - INFO - Iter(train) [47900/80000] base_lr: 1.0882e-04 lr: 1.0882e-04 eta: 17:27:07 time: 1.9543 data_time: 0.0185 memory: 6645 grad_norm: 4.5371 loss: 0.5790 decode.loss_ce: 0.3852 decode.acc_seg: 83.1202 aux.loss_ce: 0.1939 aux.acc_seg: 84.4410 2024/10/25 23:27:15 - mmengine - INFO - Iter(train) [47950/80000] base_lr: 1.0868e-04 lr: 1.0868e-04 eta: 17:25:29 time: 1.9487 data_time: 0.0187 memory: 6646 grad_norm: 7.2278 loss: 0.5831 decode.loss_ce: 0.3911 decode.acc_seg: 83.1290 aux.loss_ce: 0.1920 aux.acc_seg: 81.9217 2024/10/25 23:28:52 - mmengine - INFO - Exp name: 
deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/25 23:28:52 - mmengine - INFO - Iter(train) [48000/80000] base_lr: 1.0854e-04 lr: 1.0854e-04 eta: 17:23:51 time: 1.9477 data_time: 0.0195 memory: 6645 grad_norm: 7.1400 loss: 0.6874 decode.loss_ce: 0.4638 decode.acc_seg: 85.6067 aux.loss_ce: 0.2235 aux.acc_seg: 83.8762
2024/10/25 23:28:52 - mmengine - INFO - Saving checkpoint at 48000 iterations
2024/10/25 23:28:56 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0327 data_time: 0.0017 memory: 1049
2024/10/25 23:28:58 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0320 data_time: 0.0016 memory: 1117
2024/10/25 23:29:00 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0335 data_time: 0.0017 memory: 833
2024/10/25 23:29:01 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:09 time: 0.0327 data_time: 0.0016 memory: 866
2024/10/25 23:29:03 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0341 data_time: 0.0020 memory: 906
2024/10/25 23:29:05 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0338 data_time: 0.0020 memory: 2028
2024/10/25 23:29:07 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0333 data_time: 0.0016 memory: 832
2024/10/25 23:29:08 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0329 data_time: 0.0018 memory: 904
2024/10/25 23:29:10 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0334 data_time: 0.0016 memory: 839
2024/10/25 23:29:12 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0330 data_time: 0.0015 memory: 889
2024/10/25 23:29:14 - mmengine - INFO - per class results:
2024/10/25 23:29:14 - mmengine - INFO -
+---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 66.11 | 83.53 |
| building | 74.92 | 85.98 |
| sky | 88.1 | 93.95 |
| floor | 69.66 | 86.78 |
| tree | 65.82 | 83.54 |
| ceiling | 76.33 | 90.11 |
| road | 76.05 | 84.55 |
| bed | 77.29 | 87.77 |
| windowpane | 51.28 | 69.28 |
| grass | 57.93 | 84.65 |
| cabinet | 48.35 | 63.09 |
| sidewalk | 53.6 | 72.68 |
| person | 58.13 | 76.17 |
| earth | 27.52 | 39.32 |
| door | 35.08 | 48.6 |
| table | 42.24 | 60.45 |
| mountain | 46.46 | 58.76 |
| plant | 40.78 | 51.84 |
| curtain | 53.63 | 67.93 |
| chair | 37.28 | 50.87 |
| car | 67.34 | 81.39 |
| water | 44.82 | 60.11 |
| painting | 52.03 | 66.77 |
| sofa | 53.25 | 68.81 |
| shelf | 26.49 | 36.79 |
| house | 39.15 | 65.81 |
| sea | 44.67 | 63.94 |
| mirror | 50.24 | 58.37 |
| rug | 38.69 | 43.22 |
| field | 20.9 | 28.33 |
| armchair | 32.43 | 50.73 |
| seat | 47.36 | 75.11 |
| fence | 29.83 | 37.03 |
| desk | 36.13 | 51.59 |
| rock | 26.74 | 45.69 |
| wardrobe | 40.05 | 63.2 |
| lamp | 28.29 | 35.39 |
| bathtub | 67.86 | 74.98 |
| railing | 24.24 | 37.42 |
| cushion | 34.58 | 45.42 |
| base | 18.67 | 27.99 |
| box | 12.26 | 17.24 |
| column | 24.31 | 29.12 |
| signboard | 19.96 | 31.2 |
| chest of drawers | 33.57 | 61.05 |
| counter | 33.65 | 43.42 |
| sand | 37.7 | 53.42 |
| sink | 49.3 | 61.7 |
| skyscraper | 55.35 | 81.71 |
| fireplace | 63.5 | 76.17 |
| refrigerator | 63.81 | 77.85 |
| grandstand | 34.66 | 57.0 |
| path | 16.95 | 23.57 |
| stairs | 16.0 | 21.98 |
| runway | 63.81 | 77.88 |
| case | 39.89 | 58.31 |
| pool table | 78.46 | 85.66 |
| pillow | 34.75 | 42.67 |
| screen door | 63.05 | 70.59 |
| stairway | 20.26 | 30.0 |
| river | 8.55 | 16.18 |
| bridge | 44.18 | 50.02 |
| bookcase | 22.52 | 29.71 |
| blind | 27.71 | 36.26 |
| coffee table | 46.93 | 58.47 |
| toilet | 63.9 | 73.14 |
| flower | 22.16 | 30.67 |
| book | 32.4 | 47.27 |
| hill | 7.9 | 14.1 |
| bench | 29.03 | 40.79 |
| countertop | 42.17 | 52.74 |
| stove | 52.8 | 58.95 |
| palm | 25.02 | 30.8 |
| kitchen island | 26.26 | 41.6 |
| computer | 42.7 | 52.43 |
| swivel chair | 35.12 | 45.92 |
| boat | 34.79 | 47.34 |
| bar | 26.62 | 29.12 |
| arcade machine | 33.65 | 52.82 |
| hovel | 36.02 | 46.3 |
| bus | 76.76 | 86.69 |
| towel | 39.66 | 48.54 |
| light | 5.15 | 5.29 |
| truck | 21.23 | 41.38 |
| tower | 30.46 | 63.75 |
| chandelier | 46.12 | 61.46 |
| awning | 12.34 | 14.89 |
| streetlight | 4.97 | 6.06 |
| booth | 38.04 | 39.79 |
| television receiver | 46.24 | 61.2 |
| airplane | 39.04 | 42.91 |
| dirt track | 22.23 | 37.2 |
| apparel | 26.95 | 39.58 |
| pole | 5.52 | 6.7 |
| land | 0.17 | 0.29 |
| bannister | 0.82 | 0.99 |
| escalator | 37.59 | 53.45 |
| ottoman | 28.02 | 46.01 |
| bottle | 5.1 | 5.62 |
| buffet | 40.23 | 45.09 |
| poster | 21.61 | 27.97 |
| stage | 6.7 | 11.93 |
| van | 37.12 | 41.35 |
| ship | 4.68 | 5.38 |
| fountain | 13.65 | 13.84 |
| conveyer belt | 53.94 | 63.53 |
| canopy | 15.64 | 17.08 |
| washer | 57.92 | 67.23 |
| plaything | 11.76 | 17.43 |
| swimming pool | 47.4 | 49.04 |
| stool | 15.38 | 20.28 |
| barrel | 35.36 | 63.27 |
| basket | 9.29 | 10.98 |
| waterfall | 61.23 | 74.78 |
| tent | 83.58 | 87.76 |
| bag | 4.6 | 5.59 |
| minibike | 41.58 | 60.17 |
| cradle | 46.58 | 81.1 |
| oven | 29.01 | 55.19 |
| ball | 33.63 | 42.92 |
| food | 40.54 | 61.28 |
| step | 4.82 | 5.64 |
| tank | 33.81 | 34.51 |
| trade name | 8.69 | 9.81 |
| microwave | 42.75 | 47.69 |
| pot | 18.36 | 21.18 |
| animal | 39.43 | 42.27 |
| bicycle | 27.97 | 48.2 |
| lake | 4.73 | 8.71 |
| dishwasher | 38.92 | 45.41 |
| screen | 46.21 | 64.89 |
| blanket | 1.45 | 1.85 |
| sculpture | 31.2 | 42.76 |
| hood | 29.35 | 30.67 |
| sconce | 8.87 | 10.08 |
| vase | 15.21 | 20.21 |
| traffic light | 8.85 | 15.38 |
| tray | 1.48 | 3.42 |
| ashcan | 22.73 | 31.24 |
| fan | 23.77 | 29.5 |
| pier | 40.03 | 42.54 |
| crt screen | 0.0 | 0.0 |
| plate | 17.03 | 24.12 |
| monitor | 1.47 | 1.59 |
| bulletin board | 26.42 | 30.51 |
| shower | 2.46 | 3.65 |
| radiator | 28.96 | 32.21 |
| glass | 0.96 | 0.99 |
| clock | 5.49 | 6.13 |
| flag | 12.86 | 14.75 |
+---------------------+-------+-------+
2024/10/25 23:29:14 - mmengine - INFO - Iter(val) [500/500] aAcc: 74.9300 mIoU: 34.2300 mAcc: 44.4500 data_time: 0.0018 time:
0.0334 2024/10/25 23:30:51 - mmengine - INFO - Iter(train) [48050/80000] base_lr: 1.0840e-04 lr: 1.0840e-04 eta: 17:22:14 time: 1.9661 data_time: 0.0171 memory: 6646 grad_norm: 4.6223 loss: 0.5638 decode.loss_ce: 0.3683 decode.acc_seg: 84.1489 aux.loss_ce: 0.1955 aux.acc_seg: 83.7808 2024/10/25 23:32:29 - mmengine - INFO - Iter(train) [48100/80000] base_lr: 1.0827e-04 lr: 1.0827e-04 eta: 17:20:36 time: 1.9508 data_time: 0.0189 memory: 6646 grad_norm: 4.4396 loss: 0.5936 decode.loss_ce: 0.3810 decode.acc_seg: 90.3609 aux.loss_ce: 0.2127 aux.acc_seg: 88.8988 2024/10/25 23:34:07 - mmengine - INFO - Iter(train) [48150/80000] base_lr: 1.0812e-04 lr: 1.0812e-04 eta: 17:18:58 time: 1.9467 data_time: 0.0188 memory: 6645 grad_norm: 5.3143 loss: 0.6036 decode.loss_ce: 0.3877 decode.acc_seg: 93.4032 aux.loss_ce: 0.2159 aux.acc_seg: 79.2889 2024/10/25 23:35:44 - mmengine - INFO - Iter(train) [48200/80000] base_lr: 1.0798e-04 lr: 1.0798e-04 eta: 17:17:20 time: 1.9566 data_time: 0.0187 memory: 6644 grad_norm: 4.9259 loss: 0.6159 decode.loss_ce: 0.4182 decode.acc_seg: 78.3643 aux.loss_ce: 0.1976 aux.acc_seg: 72.4723 2024/10/25 23:37:22 - mmengine - INFO - Iter(train) [48250/80000] base_lr: 1.0784e-04 lr: 1.0784e-04 eta: 17:15:42 time: 1.9602 data_time: 0.0194 memory: 6645 grad_norm: 6.0253 loss: 0.5457 decode.loss_ce: 0.3608 decode.acc_seg: 84.3579 aux.loss_ce: 0.1849 aux.acc_seg: 75.6328 2024/10/25 23:38:59 - mmengine - INFO - Iter(train) [48300/80000] base_lr: 1.0770e-04 lr: 1.0770e-04 eta: 17:14:04 time: 1.9571 data_time: 0.0179 memory: 6646 grad_norm: 5.1080 loss: 0.6843 decode.loss_ce: 0.4462 decode.acc_seg: 84.3894 aux.loss_ce: 0.2381 aux.acc_seg: 82.8522 2024/10/25 23:40:38 - mmengine - INFO - Iter(train) [48350/80000] base_lr: 1.0756e-04 lr: 1.0756e-04 eta: 17:12:26 time: 1.9525 data_time: 0.0196 memory: 6645 grad_norm: 7.8386 loss: 0.5780 decode.loss_ce: 0.3696 decode.acc_seg: 91.6217 aux.loss_ce: 0.2084 aux.acc_seg: 89.9634 2024/10/25 23:42:16 - mmengine - INFO - 
Iter(train) [48400/80000] base_lr: 1.0741e-04 lr: 1.0741e-04 eta: 17:10:48 time: 1.9591 data_time: 0.0190 memory: 6646 grad_norm: 5.0031 loss: 0.5378 decode.loss_ce: 0.3598 decode.acc_seg: 83.4322 aux.loss_ce: 0.1780 aux.acc_seg: 83.1205 2024/10/25 23:43:54 - mmengine - INFO - Iter(train) [48450/80000] base_lr: 1.0727e-04 lr: 1.0727e-04 eta: 17:09:11 time: 1.9467 data_time: 0.0185 memory: 6646 grad_norm: 5.6976 loss: 0.5903 decode.loss_ce: 0.3977 decode.acc_seg: 81.8047 aux.loss_ce: 0.1927 aux.acc_seg: 80.2494 2024/10/25 23:45:31 - mmengine - INFO - Iter(train) [48500/80000] base_lr: 1.0712e-04 lr: 1.0712e-04 eta: 17:07:32 time: 1.9457 data_time: 0.0198 memory: 6645 grad_norm: 6.8181 loss: 0.5471 decode.loss_ce: 0.3716 decode.acc_seg: 82.8220 aux.loss_ce: 0.1755 aux.acc_seg: 82.2093 2024/10/25 23:47:08 - mmengine - INFO - Iter(train) [48550/80000] base_lr: 1.0698e-04 lr: 1.0698e-04 eta: 17:05:54 time: 1.9392 data_time: 0.0193 memory: 6646 grad_norm: 5.2645 loss: 0.6085 decode.loss_ce: 0.3997 decode.acc_seg: 86.4006 aux.loss_ce: 0.2089 aux.acc_seg: 83.1007 2024/10/25 23:48:46 - mmengine - INFO - Iter(train) [48600/80000] base_lr: 1.0683e-04 lr: 1.0683e-04 eta: 17:04:16 time: 1.9450 data_time: 0.0174 memory: 6645 grad_norm: 5.9496 loss: 0.5344 decode.loss_ce: 0.3573 decode.acc_seg: 91.6901 aux.loss_ce: 0.1770 aux.acc_seg: 88.6710 2024/10/25 23:50:24 - mmengine - INFO - Iter(train) [48650/80000] base_lr: 1.0668e-04 lr: 1.0668e-04 eta: 17:02:38 time: 1.9482 data_time: 0.0187 memory: 6645 grad_norm: 4.1745 loss: 0.5879 decode.loss_ce: 0.3808 decode.acc_seg: 79.5549 aux.loss_ce: 0.2072 aux.acc_seg: 70.4687 2024/10/25 23:52:02 - mmengine - INFO - Iter(train) [48700/80000] base_lr: 1.0653e-04 lr: 1.0653e-04 eta: 17:01:00 time: 1.9497 data_time: 0.0191 memory: 6645 grad_norm: 5.3530 loss: 0.5780 decode.loss_ce: 0.3804 decode.acc_seg: 86.5964 aux.loss_ce: 0.1976 aux.acc_seg: 83.7354 2024/10/25 23:53:43 - mmengine - INFO - Iter(train) [48750/80000] base_lr: 1.0638e-04 lr: 
1.0638e-04 eta: 16:59:25 time: 1.9549 data_time: 0.0192 memory: 6645 grad_norm: 4.8883 loss: 0.5546 decode.loss_ce: 0.3544 decode.acc_seg: 81.8543 aux.loss_ce: 0.2002 aux.acc_seg: 82.5087 2024/10/25 23:55:22 - mmengine - INFO - Iter(train) [48800/80000] base_lr: 1.0623e-04 lr: 1.0623e-04 eta: 16:57:47 time: 1.9690 data_time: 0.0175 memory: 6645 grad_norm: 5.0520 loss: 0.5664 decode.loss_ce: 0.3742 decode.acc_seg: 87.1833 aux.loss_ce: 0.1922 aux.acc_seg: 82.7256 2024/10/25 23:56:59 - mmengine - INFO - Iter(train) [48850/80000] base_lr: 1.0608e-04 lr: 1.0608e-04 eta: 16:56:09 time: 1.9486 data_time: 0.0165 memory: 6645 grad_norm: 5.1897 loss: 0.5579 decode.loss_ce: 0.3640 decode.acc_seg: 88.1151 aux.loss_ce: 0.1939 aux.acc_seg: 84.6294 2024/10/25 23:58:37 - mmengine - INFO - Iter(train) [48900/80000] base_lr: 1.0593e-04 lr: 1.0593e-04 eta: 16:54:32 time: 1.9549 data_time: 0.0193 memory: 6646 grad_norm: 4.7072 loss: 0.5476 decode.loss_ce: 0.3598 decode.acc_seg: 85.6193 aux.loss_ce: 0.1879 aux.acc_seg: 84.9565 2024/10/26 00:00:15 - mmengine - INFO - Iter(train) [48950/80000] base_lr: 1.0578e-04 lr: 1.0578e-04 eta: 16:52:54 time: 1.9572 data_time: 0.0182 memory: 6646 grad_norm: 5.2836 loss: 0.5975 decode.loss_ce: 0.3919 decode.acc_seg: 73.4342 aux.loss_ce: 0.2056 aux.acc_seg: 72.0592 2024/10/26 00:01:53 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 00:01:53 - mmengine - INFO - Iter(train) [49000/80000] base_lr: 1.0563e-04 lr: 1.0563e-04 eta: 16:51:16 time: 1.9602 data_time: 0.0192 memory: 6646 grad_norm: 4.4233 loss: 0.6035 decode.loss_ce: 0.3969 decode.acc_seg: 83.9673 aux.loss_ce: 0.2066 aux.acc_seg: 69.9478 2024/10/26 00:03:31 - mmengine - INFO - Iter(train) [49050/80000] base_lr: 1.0547e-04 lr: 1.0547e-04 eta: 16:49:38 time: 1.9521 data_time: 0.0194 memory: 6646 grad_norm: 4.8843 loss: 0.5534 decode.loss_ce: 0.3709 decode.acc_seg: 80.5090 aux.loss_ce: 0.1825 aux.acc_seg: 75.9912 2024/10/26 00:05:09 - mmengine - 
INFO - Iter(train) [49100/80000] base_lr: 1.0532e-04 lr: 1.0532e-04 eta: 16:48:00 time: 1.9536 data_time: 0.0189 memory: 6646 grad_norm: 5.1739 loss: 0.5867 decode.loss_ce: 0.3991 decode.acc_seg: 87.9982 aux.loss_ce: 0.1875 aux.acc_seg: 86.4360 2024/10/26 00:06:46 - mmengine - INFO - Iter(train) [49150/80000] base_lr: 1.0517e-04 lr: 1.0517e-04 eta: 16:46:22 time: 1.9457 data_time: 0.0195 memory: 6646 grad_norm: 7.6282 loss: 0.5360 decode.loss_ce: 0.3545 decode.acc_seg: 84.2513 aux.loss_ce: 0.1815 aux.acc_seg: 81.0262 2024/10/26 00:08:24 - mmengine - INFO - Iter(train) [49200/80000] base_lr: 1.0501e-04 lr: 1.0501e-04 eta: 16:44:44 time: 1.9673 data_time: 0.0172 memory: 6645 grad_norm: 5.2025 loss: 0.5531 decode.loss_ce: 0.3664 decode.acc_seg: 83.7897 aux.loss_ce: 0.1867 aux.acc_seg: 80.1168 2024/10/26 00:10:02 - mmengine - INFO - Iter(train) [49250/80000] base_lr: 1.0485e-04 lr: 1.0485e-04 eta: 16:43:06 time: 1.9535 data_time: 0.0191 memory: 6645 grad_norm: 8.4421 loss: 0.7024 decode.loss_ce: 0.4675 decode.acc_seg: 84.8152 aux.loss_ce: 0.2349 aux.acc_seg: 80.6254 2024/10/26 00:11:43 - mmengine - INFO - Iter(train) [49300/80000] base_lr: 1.0470e-04 lr: 1.0470e-04 eta: 16:41:30 time: 1.9504 data_time: 0.0203 memory: 6646 grad_norm: 4.9774 loss: 0.5982 decode.loss_ce: 0.4014 decode.acc_seg: 82.9638 aux.loss_ce: 0.1968 aux.acc_seg: 78.0275 2024/10/26 00:13:20 - mmengine - INFO - Iter(train) [49350/80000] base_lr: 1.0454e-04 lr: 1.0454e-04 eta: 16:39:52 time: 1.9540 data_time: 0.0183 memory: 6645 grad_norm: 4.9589 loss: 0.6065 decode.loss_ce: 0.4068 decode.acc_seg: 74.1296 aux.loss_ce: 0.1997 aux.acc_seg: 73.4254 2024/10/26 00:14:58 - mmengine - INFO - Iter(train) [49400/80000] base_lr: 1.0438e-04 lr: 1.0438e-04 eta: 16:38:14 time: 1.9435 data_time: 0.0188 memory: 6647 grad_norm: 4.6769 loss: 0.5582 decode.loss_ce: 0.3740 decode.acc_seg: 86.8639 aux.loss_ce: 0.1841 aux.acc_seg: 85.0285 2024/10/26 00:16:36 - mmengine - INFO - Iter(train) [49450/80000] base_lr: 1.0422e-04 
lr: 1.0422e-04 eta: 16:36:36 time: 1.9470 data_time: 0.0187 memory: 6646 grad_norm: 4.6255 loss: 0.6279 decode.loss_ce: 0.4251 decode.acc_seg: 86.3520 aux.loss_ce: 0.2027 aux.acc_seg: 82.8294
2024/10/26 00:18:14 - mmengine - INFO - Iter(train) [49500/80000] base_lr: 1.0406e-04 lr: 1.0406e-04 eta: 16:34:58 time: 1.9547 data_time: 0.0195 memory: 6645 grad_norm: 6.8277 loss: 0.6352 decode.loss_ce: 0.4181 decode.acc_seg: 81.6620 aux.loss_ce: 0.2171 aux.acc_seg: 74.8032
2024/10/26 00:19:52 - mmengine - INFO - Iter(train) [49550/80000] base_lr: 1.0390e-04 lr: 1.0390e-04 eta: 16:33:21 time: 1.9486 data_time: 0.0196 memory: 6645 grad_norm: 8.7374 loss: 0.6173 decode.loss_ce: 0.4115 decode.acc_seg: 83.3488 aux.loss_ce: 0.2059 aux.acc_seg: 80.4582
2024/10/26 00:21:30 - mmengine - INFO - Iter(train) [49600/80000] base_lr: 1.0374e-04 lr: 1.0374e-04 eta: 16:31:43 time: 1.9522 data_time: 0.0189 memory: 6644 grad_norm: 4.5995 loss: 0.5982 decode.loss_ce: 0.3885 decode.acc_seg: 85.9220 aux.loss_ce: 0.2097 aux.acc_seg: 74.1609
2024/10/26 00:23:07 - mmengine - INFO - Iter(train) [49650/80000] base_lr: 1.0358e-04 lr: 1.0358e-04 eta: 16:30:05 time: 1.9480 data_time: 0.0185 memory: 6646 grad_norm: 5.0489 loss: 0.4942 decode.loss_ce: 0.3268 decode.acc_seg: 84.2211 aux.loss_ce: 0.1674 aux.acc_seg: 82.8136
2024/10/26 00:24:45 - mmengine - INFO - Iter(train) [49700/80000] base_lr: 1.0342e-04 lr: 1.0342e-04 eta: 16:28:27 time: 1.9461 data_time: 0.0189 memory: 6645 grad_norm: 5.0353 loss: 0.5537 decode.loss_ce: 0.3713 decode.acc_seg: 86.4659 aux.loss_ce: 0.1824 aux.acc_seg: 87.0933
2024/10/26 00:26:22 - mmengine - INFO - Iter(train) [49750/80000] base_lr: 1.0325e-04 lr: 1.0325e-04 eta: 16:26:49 time: 1.9423 data_time: 0.0191 memory: 6646 grad_norm: 6.7235 loss: 0.5549 decode.loss_ce: 0.3584 decode.acc_seg: 87.2772 aux.loss_ce: 0.1964 aux.acc_seg: 85.0164
2024/10/26 00:28:00 - mmengine - INFO - Iter(train) [49800/80000] base_lr: 1.0309e-04 lr: 1.0309e-04 eta: 16:25:11 time: 1.9495 data_time: 0.0178 memory: 6645 grad_norm: 3.6387 loss: 0.5635 decode.loss_ce: 0.3771 decode.acc_seg: 89.5667 aux.loss_ce: 0.1863 aux.acc_seg: 88.1284
2024/10/26 00:29:38 - mmengine - INFO - Iter(train) [49850/80000] base_lr: 1.0293e-04 lr: 1.0293e-04 eta: 16:23:33 time: 1.9544 data_time: 0.0180 memory: 6645 grad_norm: 5.5831 loss: 0.5247 decode.loss_ce: 0.3518 decode.acc_seg: 85.1297 aux.loss_ce: 0.1729 aux.acc_seg: 85.3727
2024/10/26 00:31:16 - mmengine - INFO - Iter(train) [49900/80000] base_lr: 1.0276e-04 lr: 1.0276e-04 eta: 16:21:55 time: 1.9477 data_time: 0.0186 memory: 6646 grad_norm: 4.5288 loss: 0.5636 decode.loss_ce: 0.3706 decode.acc_seg: 84.6046 aux.loss_ce: 0.1930 aux.acc_seg: 82.9527
2024/10/26 00:32:54 - mmengine - INFO - Iter(train) [49950/80000] base_lr: 1.0260e-04 lr: 1.0260e-04 eta: 16:20:17 time: 1.9636 data_time: 0.0188 memory: 6645 grad_norm: 5.8049 loss: 0.6432 decode.loss_ce: 0.4155 decode.acc_seg: 89.0028 aux.loss_ce: 0.2277 aux.acc_seg: 86.6113
2024/10/26 00:34:31 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 00:34:31 - mmengine - INFO - Iter(train) [50000/80000] base_lr: 1.0243e-04 lr: 1.0243e-04 eta: 16:18:39 time: 1.9481 data_time: 0.0190 memory: 6645 grad_norm: 5.3287 loss: 0.6060 decode.loss_ce: 0.4058 decode.acc_seg: 81.6038 aux.loss_ce: 0.2002 aux.acc_seg: 80.2562
2024/10/26 00:36:09 - mmengine - INFO - Iter(train) [50050/80000] base_lr: 1.0226e-04 lr: 1.0226e-04 eta: 16:17:01 time: 1.9480 data_time: 0.0193 memory: 6645 grad_norm: 4.1654 loss: 0.5863 decode.loss_ce: 0.3878 decode.acc_seg: 91.4941 aux.loss_ce: 0.1985 aux.acc_seg: 84.6087
2024/10/26 00:37:46 - mmengine - INFO - Iter(train) [50100/80000] base_lr: 1.0210e-04 lr: 1.0210e-04 eta: 16:15:23 time: 1.9434 data_time: 0.0195 memory: 6645 grad_norm: 4.7100 loss: 0.5982 decode.loss_ce: 0.3978 decode.acc_seg: 77.6223 aux.loss_ce: 0.2004 aux.acc_seg: 78.3445
2024/10/26 00:39:24 - mmengine - INFO - Iter(train) [50150/80000] base_lr: 1.0193e-04 lr: 1.0193e-04 eta: 16:13:45 time: 1.9542 data_time: 0.0172 memory: 6645 grad_norm: 5.0228 loss: 0.6692 decode.loss_ce: 0.4576 decode.acc_seg: 73.9221 aux.loss_ce: 0.2116 aux.acc_seg: 72.7357
2024/10/26 00:41:02 - mmengine - INFO - Iter(train) [50200/80000] base_lr: 1.0176e-04 lr: 1.0176e-04 eta: 16:12:07 time: 1.9473 data_time: 0.0193 memory: 6645 grad_norm: 4.4636 loss: 0.5926 decode.loss_ce: 0.3838 decode.acc_seg: 82.8074 aux.loss_ce: 0.2088 aux.acc_seg: 82.0788
2024/10/26 00:42:43 - mmengine - INFO - Iter(train) [50250/80000] base_lr: 1.0159e-04 lr: 1.0159e-04 eta: 16:10:31 time: 1.9541 data_time: 0.0192 memory: 6645 grad_norm: 4.9261 loss: 0.5007 decode.loss_ce: 0.3313 decode.acc_seg: 84.8463 aux.loss_ce: 0.1694 aux.acc_seg: 82.6588
2024/10/26 00:44:21 - mmengine - INFO - Iter(train) [50300/80000] base_lr: 1.0142e-04 lr: 1.0142e-04 eta: 16:08:53 time: 1.9510 data_time: 0.0199 memory: 6646 grad_norm: 9.1048 loss: 0.5601 decode.loss_ce: 0.3654 decode.acc_seg: 84.0303 aux.loss_ce: 0.1948 aux.acc_seg: 79.4498
2024/10/26 00:45:58 - mmengine - INFO - Iter(train) [50350/80000] base_lr: 1.0125e-04 lr: 1.0125e-04 eta: 16:07:15 time: 1.9458 data_time: 0.0189 memory: 6645 grad_norm: 7.3987 loss: 0.4406 decode.loss_ce: 0.2918 decode.acc_seg: 83.9846 aux.loss_ce: 0.1488 aux.acc_seg: 86.4956
2024/10/26 00:47:36 - mmengine - INFO - Iter(train) [50400/80000] base_lr: 1.0108e-04 lr: 1.0108e-04 eta: 16:05:37 time: 1.9464 data_time: 0.0186 memory: 6645 grad_norm: 4.6152 loss: 0.5637 decode.loss_ce: 0.3644 decode.acc_seg: 92.3428 aux.loss_ce: 0.1993 aux.acc_seg: 87.4514
2024/10/26 00:49:14 - mmengine - INFO - Iter(train) [50450/80000] base_lr: 1.0090e-04 lr: 1.0090e-04 eta: 16:03:59 time: 1.9690 data_time: 0.0197 memory: 6646 grad_norm: 4.6659 loss: 0.5172 decode.loss_ce: 0.3342 decode.acc_seg: 84.7587 aux.loss_ce: 0.1830 aux.acc_seg: 82.2330
2024/10/26 00:50:51 - mmengine - INFO - Iter(train) [50500/80000] base_lr: 1.0073e-04 lr: 1.0073e-04 eta: 16:02:21 time: 1.9464 data_time: 0.0192 memory: 6649 grad_norm: 4.1825 loss: 0.6392 decode.loss_ce: 0.4287 decode.acc_seg: 82.6820 aux.loss_ce: 0.2105 aux.acc_seg: 82.2952
2024/10/26 00:52:29 - mmengine - INFO - Iter(train) [50550/80000] base_lr: 1.0056e-04 lr: 1.0056e-04 eta: 16:00:43 time: 1.9501 data_time: 0.0187 memory: 6645 grad_norm: 4.7783 loss: 0.5668 decode.loss_ce: 0.3892 decode.acc_seg: 82.1061 aux.loss_ce: 0.1775 aux.acc_seg: 75.9354
2024/10/26 00:54:06 - mmengine - INFO - Iter(train) [50600/80000] base_lr: 1.0038e-04 lr: 1.0038e-04 eta: 15:59:05 time: 1.9469 data_time: 0.0186 memory: 6646 grad_norm: 5.1020 loss: 0.5683 decode.loss_ce: 0.3804 decode.acc_seg: 82.7480 aux.loss_ce: 0.1879 aux.acc_seg: 79.6389
2024/10/26 00:55:44 - mmengine - INFO - Iter(train) [50650/80000] base_lr: 1.0021e-04 lr: 1.0021e-04 eta: 15:57:27 time: 1.9562 data_time: 0.0185 memory: 6645 grad_norm: 4.5940 loss: 0.5299 decode.loss_ce: 0.3484 decode.acc_seg: 87.4456 aux.loss_ce: 0.1815 aux.acc_seg: 82.4118
2024/10/26 00:57:22 - mmengine - INFO - Iter(train) [50700/80000] base_lr: 1.0003e-04 lr: 1.0003e-04 eta: 15:55:49 time: 1.9481 data_time: 0.0190 memory: 6645 grad_norm: 5.4297 loss: 0.7042 decode.loss_ce: 0.4736 decode.acc_seg: 84.0724 aux.loss_ce: 0.2306 aux.acc_seg: 83.8547
2024/10/26 00:59:00 - mmengine - INFO - Iter(train) [50750/80000] base_lr: 9.9859e-05 lr: 9.9859e-05 eta: 15:54:11 time: 1.9531 data_time: 0.0194 memory: 6646 grad_norm: 5.3592 loss: 0.5362 decode.loss_ce: 0.3486 decode.acc_seg: 87.6594 aux.loss_ce: 0.1876 aux.acc_seg: 86.3027
2024/10/26 01:00:37 - mmengine - INFO - Iter(train) [50800/80000] base_lr: 9.9682e-05 lr: 9.9682e-05 eta: 15:52:33 time: 1.9517 data_time: 0.0187 memory: 6646 grad_norm: 5.8244 loss: 0.6928 decode.loss_ce: 0.4491 decode.acc_seg: 77.9771 aux.loss_ce: 0.2437 aux.acc_seg: 72.7260
2024/10/26 01:02:15 - mmengine - INFO - Iter(train) [50850/80000] base_lr: 9.9505e-05 lr: 9.9505e-05 eta: 15:50:55 time: 1.9494 data_time: 0.0183 memory: 6645 grad_norm: 7.3060 loss: 0.5199 decode.loss_ce: 0.3519 decode.acc_seg: 87.1810 aux.loss_ce: 0.1680 aux.acc_seg: 87.4166
2024/10/26 01:03:53 - mmengine - INFO - Iter(train) [50900/80000] base_lr: 9.9328e-05 lr: 9.9328e-05 eta: 15:49:17 time: 1.9547 data_time: 0.0189 memory: 6645 grad_norm: 4.6360 loss: 0.6064 decode.loss_ce: 0.4089 decode.acc_seg: 86.0870 aux.loss_ce: 0.1975 aux.acc_seg: 86.6173
2024/10/26 01:05:31 - mmengine - INFO - Iter(train) [50950/80000] base_lr: 9.9149e-05 lr: 9.9149e-05 eta: 15:47:39 time: 1.9544 data_time: 0.0178 memory: 6646 grad_norm: 4.6782 loss: 0.5389 decode.loss_ce: 0.3483 decode.acc_seg: 89.1889 aux.loss_ce: 0.1906 aux.acc_seg: 89.4508
2024/10/26 01:07:08 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 01:07:08 - mmengine - INFO - Iter(train) [51000/80000] base_lr: 9.8970e-05 lr: 9.8970e-05 eta: 15:46:01 time: 1.9511 data_time: 0.0186 memory: 6645 grad_norm: 5.1293 loss: 0.5280 decode.loss_ce: 0.3570 decode.acc_seg: 83.5046 aux.loss_ce: 0.1710 aux.acc_seg: 81.6943
2024/10/26 01:08:46 - mmengine - INFO - Iter(train) [51050/80000] base_lr: 9.8791e-05 lr: 9.8791e-05 eta: 15:44:24 time: 1.9882 data_time: 0.0195 memory: 6645 grad_norm: 9.5627 loss: 0.8102 decode.loss_ce: 0.5501 decode.acc_seg: 70.7274 aux.loss_ce: 0.2601 aux.acc_seg: 72.0846
2024/10/26 01:10:24 - mmengine - INFO - Iter(train) [51100/80000] base_lr: 9.8611e-05 lr: 9.8611e-05 eta: 15:42:46 time: 1.9544 data_time: 0.0187 memory: 6645 grad_norm: 5.3884 loss: 0.5747 decode.loss_ce: 0.3895 decode.acc_seg: 85.6280 aux.loss_ce: 0.1852 aux.acc_seg: 75.2702
2024/10/26 01:12:02 - mmengine - INFO - Iter(train) [51150/80000] base_lr: 9.8430e-05 lr: 9.8430e-05 eta: 15:41:08 time: 1.9515 data_time: 0.0187 memory: 6646 grad_norm: 6.0437 loss: 0.6554 decode.loss_ce: 0.4314 decode.acc_seg: 82.9612 aux.loss_ce: 0.2240 aux.acc_seg: 79.5947
2024/10/26 01:13:43 - mmengine - INFO - Iter(train) [51200/80000] base_lr: 9.8249e-05 lr: 9.8249e-05 eta: 15:39:32 time: 1.9489 data_time: 0.0187 memory: 6646 grad_norm: 5.3142 loss: 0.6329 decode.loss_ce: 0.4254 decode.acc_seg: 82.3561 aux.loss_ce: 0.2075 aux.acc_seg: 79.5602
2024/10/26 01:15:21 - mmengine - INFO - Iter(train) [51250/80000] base_lr: 9.8067e-05 lr: 9.8067e-05 eta: 15:37:54 time: 1.9560 data_time: 0.0186 memory: 6645 grad_norm: 5.0884 loss: 0.5396 decode.loss_ce: 0.3622 decode.acc_seg: 89.7677 aux.loss_ce: 0.1773 aux.acc_seg: 88.3146
2024/10/26 01:16:58 - mmengine - INFO - Iter(train) [51300/80000] base_lr: 9.7885e-05 lr: 9.7885e-05 eta: 15:36:16 time: 1.9456 data_time: 0.0189 memory: 6645 grad_norm: 4.9795 loss: 0.4838 decode.loss_ce: 0.3248 decode.acc_seg: 87.1604 aux.loss_ce: 0.1590 aux.acc_seg: 81.0166
2024/10/26 01:18:36 - mmengine - INFO - Iter(train) [51350/80000] base_lr: 9.7702e-05 lr: 9.7702e-05 eta: 15:34:38 time: 1.9445 data_time: 0.0191 memory: 6645 grad_norm: 3.9325 loss: 0.5225 decode.loss_ce: 0.3364 decode.acc_seg: 87.8620 aux.loss_ce: 0.1860 aux.acc_seg: 80.9299
2024/10/26 01:20:13 - mmengine - INFO - Iter(train) [51400/80000] base_lr: 9.7518e-05 lr: 9.7518e-05 eta: 15:32:59 time: 1.9475 data_time: 0.0182 memory: 6648 grad_norm: 4.9041 loss: 0.5827 decode.loss_ce: 0.3801 decode.acc_seg: 87.5045 aux.loss_ce: 0.2026 aux.acc_seg: 81.7962
2024/10/26 01:21:51 - mmengine - INFO - Iter(train) [51450/80000] base_lr: 9.7334e-05 lr: 9.7334e-05 eta: 15:31:22 time: 1.9523 data_time: 0.0186 memory: 6645 grad_norm: 4.5219 loss: 0.5421 decode.loss_ce: 0.3590 decode.acc_seg: 89.2082 aux.loss_ce: 0.1831 aux.acc_seg: 87.2401
2024/10/26 01:23:29 - mmengine - INFO - Iter(train) [51500/80000] base_lr: 9.7149e-05 lr: 9.7149e-05 eta: 15:29:44 time: 1.9525 data_time: 0.0186 memory: 6646 grad_norm: 4.6808 loss: 0.5565 decode.loss_ce: 0.3665 decode.acc_seg: 81.6186 aux.loss_ce: 0.1901 aux.acc_seg: 80.4075
2024/10/26 01:25:07 - mmengine - INFO - Iter(train) [51550/80000] base_lr: 9.6964e-05 lr: 9.6964e-05 eta: 15:28:06 time: 1.9507 data_time: 0.0187 memory: 6646 grad_norm: 6.7942 loss: 0.5320 decode.loss_ce: 0.3514 decode.acc_seg: 82.6572 aux.loss_ce: 0.1806 aux.acc_seg: 77.0631
2024/10/26 01:26:44 - mmengine - INFO - Iter(train) [51600/80000] base_lr: 9.6778e-05 lr: 9.6778e-05 eta: 15:26:28 time: 1.9478 data_time: 0.0188 memory: 6645 grad_norm: 4.8911 loss: 0.6055 decode.loss_ce: 0.4095 decode.acc_seg: 83.3515 aux.loss_ce: 0.1960 aux.acc_seg: 83.4771
2024/10/26 01:28:22 - mmengine - INFO - Iter(train) [51650/80000] base_lr: 9.6592e-05 lr: 9.6592e-05 eta: 15:24:50 time: 1.9779 data_time: 0.0178 memory: 6646 grad_norm: 7.3948 loss: 0.6322 decode.loss_ce: 0.4144 decode.acc_seg: 86.1882 aux.loss_ce: 0.2178 aux.acc_seg: 77.2994
2024/10/26 01:30:00 - mmengine - INFO - Iter(train) [51700/80000] base_lr: 9.6405e-05 lr: 9.6405e-05 eta: 15:23:12 time: 1.9625 data_time: 0.0180 memory: 6646 grad_norm: 5.1320 loss: 0.6148 decode.loss_ce: 0.4117 decode.acc_seg: 89.5059 aux.loss_ce: 0.2031 aux.acc_seg: 81.7547
2024/10/26 01:31:38 - mmengine - INFO - Iter(train) [51750/80000] base_lr: 9.6217e-05 lr: 9.6217e-05 eta: 15:21:34 time: 1.9467 data_time: 0.0196 memory: 6646 grad_norm: 5.4428 loss: 0.5374 decode.loss_ce: 0.3588 decode.acc_seg: 84.0712 aux.loss_ce: 0.1786 aux.acc_seg: 78.0775
2024/10/26 01:33:16 - mmengine - INFO - Iter(train) [51800/80000] base_lr: 9.6029e-05 lr: 9.6029e-05 eta: 15:19:56 time: 1.9478 data_time: 0.0193 memory: 6646 grad_norm: 5.8007 loss: 0.6036 decode.loss_ce: 0.3714 decode.acc_seg: 87.3614 aux.loss_ce: 0.2321 aux.acc_seg: 83.3934
2024/10/26 01:34:53 - mmengine - INFO - Iter(train) [51850/80000] base_lr: 9.5840e-05 lr: 9.5840e-05 eta: 15:18:18 time: 1.9459 data_time: 0.0197 memory: 6646 grad_norm: 3.7149 loss: 0.5153 decode.loss_ce: 0.3323 decode.acc_seg: 85.4802 aux.loss_ce: 0.1830 aux.acc_seg: 77.3925
2024/10/26 01:36:31 - mmengine - INFO - Iter(train) [51900/80000] base_lr: 9.5651e-05 lr: 9.5651e-05 eta: 15:16:40 time: 1.9470 data_time: 0.0177 memory: 6646 grad_norm: 4.7171 loss: 0.5662 decode.loss_ce: 0.3847 decode.acc_seg: 78.7556 aux.loss_ce: 0.1816 aux.acc_seg: 76.0984
2024/10/26 01:38:09 - mmengine - INFO - Iter(train) [51950/80000] base_lr: 9.5461e-05 lr: 9.5461e-05 eta: 15:15:02 time: 1.9528 data_time: 0.0186 memory: 6646 grad_norm: 6.0078 loss: 0.5844 decode.loss_ce: 0.3760 decode.acc_seg: 89.9047 aux.loss_ce: 0.2084 aux.acc_seg: 88.9589
2024/10/26 01:39:46 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 01:39:46 - mmengine - INFO - Iter(train) [52000/80000] base_lr: 9.5271e-05 lr: 9.5271e-05 eta: 15:13:24 time: 1.9437 data_time: 0.0178 memory: 6645 grad_norm: 4.8598 loss: 0.4780 decode.loss_ce: 0.3223 decode.acc_seg: 92.6935 aux.loss_ce: 0.1558 aux.acc_seg: 93.7805
2024/10/26 01:41:24 - mmengine - INFO - Iter(train) [52050/80000] base_lr: 9.5080e-05 lr: 9.5080e-05 eta: 15:11:47 time: 1.9508 data_time: 0.0178 memory: 6645 grad_norm: 3.9682 loss: 0.5291 decode.loss_ce: 0.3589 decode.acc_seg: 89.8064 aux.loss_ce: 0.1702 aux.acc_seg: 84.4131
2024/10/26 01:43:02 - mmengine - INFO - Iter(train) [52100/80000] base_lr: 9.4889e-05 lr: 9.4889e-05 eta: 15:10:09 time: 1.9460 data_time: 0.0183 memory: 6647 grad_norm: 10.3289 loss: 0.7255 decode.loss_ce: 0.4739 decode.acc_seg: 84.8460 aux.loss_ce: 0.2515 aux.acc_seg: 78.9346
2024/10/26 01:44:43 - mmengine - INFO - Iter(train) [52150/80000] base_lr: 9.4697e-05 lr: 9.4697e-05 eta: 15:08:32 time: 1.9462 data_time: 0.0191 memory: 6645 grad_norm: 5.2238 loss: 0.5313 decode.loss_ce: 0.3545 decode.acc_seg: 87.1330 aux.loss_ce: 0.1768 aux.acc_seg: 76.3230
2024/10/26 01:46:21 - mmengine - INFO - Iter(train) [52200/80000] base_lr: 9.4504e-05 lr: 9.4504e-05 eta: 15:06:54 time: 1.9526 data_time: 0.0193 memory: 6645 grad_norm: 4.4993 loss: 0.5654 decode.loss_ce: 0.3867 decode.acc_seg: 85.8632 aux.loss_ce: 0.1787 aux.acc_seg: 80.3014
2024/10/26 01:47:58 - mmengine - INFO - Iter(train) [52250/80000] base_lr: 9.4311e-05 lr: 9.4311e-05 eta: 15:05:16 time: 1.9476 data_time: 0.0186 memory: 6645 grad_norm: 4.2956 loss: 0.5081 decode.loss_ce: 0.3493 decode.acc_seg: 87.1430 aux.loss_ce: 0.1587 aux.acc_seg: 86.4629
2024/10/26 01:49:36 - mmengine - INFO - Iter(train) [52300/80000] base_lr: 9.4118e-05 lr: 9.4118e-05 eta: 15:03:39 time: 1.9522 data_time: 0.0189 memory: 6645 grad_norm: 4.9955 loss: 0.5394 decode.loss_ce: 0.3482 decode.acc_seg: 85.6028 aux.loss_ce: 0.1912 aux.acc_seg: 79.9004
2024/10/26 01:51:14 - mmengine - INFO - Iter(train) [52350/80000] base_lr: 9.3924e-05 lr: 9.3924e-05 eta: 15:02:01 time: 1.9485 data_time: 0.0188 memory: 6645 grad_norm: 5.0576 loss: 0.5147 decode.loss_ce: 0.3499 decode.acc_seg: 85.2638 aux.loss_ce: 0.1647 aux.acc_seg: 84.2121
2024/10/26 01:52:52 - mmengine - INFO - Iter(train) [52400/80000] base_lr: 9.3729e-05 lr: 9.3729e-05 eta: 15:00:23 time: 1.9492 data_time: 0.0191 memory: 6646 grad_norm: 4.5825 loss: 0.5329 decode.loss_ce: 0.3500 decode.acc_seg: 84.8057 aux.loss_ce: 0.1828 aux.acc_seg: 84.9166
2024/10/26 01:54:30 - mmengine - INFO - Iter(train) [52450/80000] base_lr: 9.3534e-05 lr: 9.3534e-05 eta: 14:58:45 time: 1.9551 data_time: 0.0186 memory: 6648 grad_norm: 6.4988 loss: 0.5342 decode.loss_ce: 0.3445 decode.acc_seg: 90.3146 aux.loss_ce: 0.1897 aux.acc_seg: 84.3207
2024/10/26 01:56:08 - mmengine - INFO - Iter(train) [52500/80000] base_lr: 9.3338e-05 lr: 9.3338e-05 eta: 14:57:07 time: 1.9554 data_time: 0.0187 memory: 6646 grad_norm: 4.5252 loss: 0.5756 decode.loss_ce: 0.3785 decode.acc_seg: 90.6486 aux.loss_ce: 0.1971 aux.acc_seg: 87.3740
2024/10/26 01:57:45 - mmengine - INFO - Iter(train) [52550/80000] base_lr: 9.3142e-05 lr: 9.3142e-05 eta: 14:55:29 time: 1.9484 data_time: 0.0182 memory: 6645 grad_norm: 4.5667 loss: 0.5292 decode.loss_ce: 0.3525 decode.acc_seg: 89.7268 aux.loss_ce: 0.1767 aux.acc_seg: 88.1158
2024/10/26 01:59:22 - mmengine - INFO - Iter(train) [52600/80000] base_lr: 9.2945e-05 lr: 9.2945e-05 eta: 14:53:51 time: 1.9476 data_time: 0.0190 memory: 6645 grad_norm: 4.3953 loss: 0.4907 decode.loss_ce: 0.3287 decode.acc_seg: 87.7475 aux.loss_ce: 0.1621 aux.acc_seg: 89.1290
2024/10/26 02:01:00 - mmengine - INFO - Iter(train) [52650/80000] base_lr: 9.2748e-05 lr: 9.2748e-05 eta: 14:52:13 time: 1.9445 data_time: 0.0188 memory: 6645 grad_norm: 5.3384 loss: 0.5822 decode.loss_ce: 0.3825 decode.acc_seg: 88.9591 aux.loss_ce: 0.1996 aux.acc_seg: 80.7926
2024/10/26 02:02:38 - mmengine - INFO - Iter(train) [52700/80000] base_lr: 9.2550e-05 lr: 9.2550e-05 eta: 14:50:35 time: 1.9435 data_time: 0.0187 memory: 6647 grad_norm: 4.7511 loss: 0.5089 decode.loss_ce: 0.3371 decode.acc_seg: 74.1494 aux.loss_ce: 0.1718 aux.acc_seg: 79.1006
2024/10/26 02:04:16 - mmengine - INFO - Iter(train) [52750/80000] base_lr: 9.2352e-05 lr: 9.2352e-05 eta: 14:48:57 time: 1.9443 data_time: 0.0188 memory: 6646 grad_norm: 5.8464 loss: 0.5920 decode.loss_ce: 0.3915 decode.acc_seg: 82.2502 aux.loss_ce: 0.2005 aux.acc_seg: 79.3883
2024/10/26 02:05:53 - mmengine - INFO - Iter(train) [52800/80000] base_lr: 9.2154e-05 lr: 9.2154e-05 eta: 14:47:19 time: 1.9616 data_time: 0.0182 memory: 6645 grad_norm: 6.3532 loss: 0.6455 decode.loss_ce: 0.4219 decode.acc_seg: 85.1536 aux.loss_ce: 0.2236 aux.acc_seg: 82.1716
2024/10/26 02:07:31 - mmengine - INFO - Iter(train) [52850/80000] base_lr: 9.1954e-05 lr: 9.1954e-05 eta: 14:45:41 time: 1.9580 data_time: 0.0192 memory: 6645 grad_norm: 5.2322 loss: 0.5290 decode.loss_ce: 0.3602 decode.acc_seg: 77.8416 aux.loss_ce: 0.1688 aux.acc_seg: 76.5470
2024/10/26 02:09:09 - mmengine - INFO - Iter(train) [52900/80000] base_lr: 9.1755e-05 lr: 9.1755e-05 eta: 14:44:03 time: 1.9469 data_time: 0.0191 memory: 6646 grad_norm: 5.2261 loss: 0.5764 decode.loss_ce: 0.3687 decode.acc_seg: 78.8930 aux.loss_ce: 0.2077 aux.acc_seg: 74.7941
2024/10/26 02:10:47 - mmengine - INFO - Iter(train) [52950/80000] base_lr: 9.1555e-05 lr: 9.1555e-05 eta: 14:42:25 time: 1.9509 data_time: 0.0177 memory: 6645 grad_norm: 5.6686 loss: 0.4399 decode.loss_ce: 0.2930 decode.acc_seg: 83.9719 aux.loss_ce: 0.1469 aux.acc_seg: 81.5847
2024/10/26 02:12:24 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 02:12:24 - mmengine - INFO - Iter(train) [53000/80000] base_lr: 9.1354e-05 lr: 9.1354e-05 eta: 14:40:47 time: 1.9499 data_time: 0.0195 memory: 6646 grad_norm: 5.1177 loss: 0.5442 decode.loss_ce: 0.3671 decode.acc_seg: 77.8342 aux.loss_ce: 0.1771 aux.acc_seg: 74.7156
2024/10/26 02:14:02 - mmengine - INFO - Iter(train) [53050/80000] base_lr: 9.1153e-05 lr: 9.1153e-05 eta: 14:39:09 time: 1.9455 data_time: 0.0192 memory: 6646 grad_norm: 3.9590 loss: 0.5817 decode.loss_ce: 0.3866 decode.acc_seg: 86.6015 aux.loss_ce: 0.1951 aux.acc_seg: 86.1699
2024/10/26 02:15:42 - mmengine - INFO - Iter(train) [53100/80000] base_lr: 9.0951e-05 lr: 9.0951e-05 eta: 14:37:33 time: 1.9485 data_time: 0.0176 memory: 6645 grad_norm: 4.3888 loss: 0.5296 decode.loss_ce: 0.3470 decode.acc_seg: 87.9713 aux.loss_ce: 0.1826 aux.acc_seg: 86.9445
2024/10/26 02:17:20 - mmengine - INFO - Iter(train) [53150/80000] base_lr: 9.0749e-05 lr: 9.0749e-05 eta: 14:35:55 time: 1.9451 data_time: 0.0181 memory: 6646 grad_norm: 5.7102 loss: 0.5794 decode.loss_ce: 0.3766 decode.acc_seg: 85.3612 aux.loss_ce: 0.2029 aux.acc_seg: 81.6667
2024/10/26 02:18:57 - mmengine - INFO - Iter(train) [53200/80000] base_lr: 9.0547e-05 lr: 9.0547e-05 eta: 14:34:17 time: 1.9464 data_time: 0.0188 memory: 6646 grad_norm: 6.2890 loss: 0.4986 decode.loss_ce: 0.3272 decode.acc_seg: 88.8241 aux.loss_ce: 0.1715 aux.acc_seg: 88.4514
2024/10/26 02:20:35 - mmengine - INFO - Iter(train) [53250/80000] base_lr: 9.0344e-05 lr: 9.0344e-05 eta: 14:32:39 time: 1.9628 data_time: 0.0186 memory: 6645 grad_norm: 3.8770 loss: 0.5989 decode.loss_ce: 0.4014 decode.acc_seg: 84.0862 aux.loss_ce: 0.1975 aux.acc_seg: 82.9329
2024/10/26 02:22:13 - mmengine - INFO - Iter(train) [53300/80000] base_lr: 9.0140e-05 lr: 9.0140e-05 eta: 14:31:01 time: 1.9504 data_time: 0.0183 memory: 6647 grad_norm: 6.0634 loss: 0.5806 decode.loss_ce: 0.3891 decode.acc_seg: 84.1942 aux.loss_ce: 0.1915 aux.acc_seg: 84.0755
2024/10/26 02:23:50 - mmengine - INFO - Iter(train) [53350/80000] base_lr: 8.9936e-05 lr: 8.9936e-05 eta: 14:29:23 time: 1.9525 data_time: 0.0195 memory: 6646 grad_norm: 6.5125 loss: 0.5674 decode.loss_ce: 0.3872 decode.acc_seg: 81.1246 aux.loss_ce: 0.1801 aux.acc_seg: 84.6944
2024/10/26 02:25:28 - mmengine - INFO - Iter(train) [53400/80000] base_lr: 8.9732e-05 lr: 8.9732e-05 eta: 14:27:44 time: 1.9413 data_time: 0.0181 memory: 6645 grad_norm: 5.6228 loss: 0.5802 decode.loss_ce: 0.3752 decode.acc_seg: 77.1221 aux.loss_ce: 0.2050 aux.acc_seg: 72.8999
2024/10/26 02:27:05 - mmengine - INFO - Iter(train) [53450/80000] base_lr: 8.9527e-05 lr: 8.9527e-05 eta: 14:26:06 time: 1.9790 data_time: 0.0185 memory: 6645 grad_norm: 3.7908 loss: 0.4673 decode.loss_ce: 0.3080 decode.acc_seg: 87.2713 aux.loss_ce: 0.1593 aux.acc_seg: 86.3244
2024/10/26 02:28:43 - mmengine - INFO - Iter(train) [53500/80000] base_lr: 8.9321e-05 lr: 8.9321e-05 eta: 14:24:28 time: 1.9445 data_time: 0.0182 memory: 6646 grad_norm: 5.6799 loss: 0.5902 decode.loss_ce: 0.3964 decode.acc_seg: 85.5437 aux.loss_ce: 0.1938 aux.acc_seg: 86.2394
2024/10/26 02:30:20 - mmengine - INFO - Iter(train) [53550/80000] base_lr: 8.9116e-05 lr: 8.9116e-05 eta: 14:22:50 time: 1.9545 data_time: 0.0173 memory: 6647 grad_norm: 4.6034 loss: 0.5393 decode.loss_ce: 0.3530 decode.acc_seg: 83.1321 aux.loss_ce: 0.1863 aux.acc_seg: 81.0780
2024/10/26 02:31:58 - mmengine - INFO - Iter(train) [53600/80000] base_lr: 8.8909e-05 lr: 8.8909e-05 eta: 14:21:12 time: 1.9413 data_time: 0.0181 memory: 6645 grad_norm: 5.6670 loss: 0.5878 decode.loss_ce: 0.3969 decode.acc_seg: 89.4321 aux.loss_ce: 0.1908 aux.acc_seg: 87.6023
2024/10/26 02:33:36 - mmengine - INFO - Iter(train) [53650/80000] base_lr: 8.8703e-05 lr: 8.8703e-05 eta: 14:19:34 time: 1.9487 data_time: 0.0192 memory: 6645 grad_norm: 3.7581 loss: 0.5375 decode.loss_ce: 0.3659 decode.acc_seg: 84.6753 aux.loss_ce: 0.1716 aux.acc_seg: 82.7938
2024/10/26 02:35:13 - mmengine - INFO - Iter(train) [53700/80000] base_lr: 8.8496e-05 lr: 8.8496e-05 eta: 14:17:56 time: 1.9450 data_time: 0.0189 memory: 6647 grad_norm: 3.6379 loss: 0.5101 decode.loss_ce: 0.3417 decode.acc_seg: 81.6845 aux.loss_ce: 0.1684 aux.acc_seg: 79.3586
2024/10/26 02:36:51 - mmengine - INFO - Iter(train) [53750/80000] base_lr: 8.8288e-05 lr: 8.8288e-05 eta: 14:16:18 time: 1.9598 data_time: 0.0173 memory: 6645 grad_norm: 6.7058 loss: 0.5785 decode.loss_ce: 0.3805 decode.acc_seg: 91.4162 aux.loss_ce: 0.1980 aux.acc_seg: 92.3410
2024/10/26 02:38:29 - mmengine - INFO - Iter(train) [53800/80000] base_lr: 8.8080e-05 lr: 8.8080e-05 eta: 14:14:40 time: 1.9533 data_time: 0.0187 memory: 6646 grad_norm: 3.9661 loss: 0.5347 decode.loss_ce: 0.3449 decode.acc_seg: 90.4489 aux.loss_ce: 0.1898 aux.acc_seg: 79.7596
2024/10/26 02:40:06 - mmengine - INFO - Iter(train) [53850/80000] base_lr: 8.7872e-05 lr: 8.7872e-05 eta: 14:13:02 time: 1.9565 data_time: 0.0182 memory: 6645 grad_norm: 6.7165 loss: 0.5609 decode.loss_ce: 0.3704 decode.acc_seg: 87.2685 aux.loss_ce: 0.1905 aux.acc_seg: 87.5873
2024/10/26 02:41:44 - mmengine - INFO - Iter(train) [53900/80000] base_lr: 8.7663e-05 lr: 8.7663e-05 eta: 14:11:24 time: 1.9502 data_time: 0.0202 memory: 6647 grad_norm: 4.6559 loss: 0.5621 decode.loss_ce: 0.3689 decode.acc_seg: 82.4051 aux.loss_ce: 0.1932 aux.acc_seg: 79.7559
2024/10/26 02:43:21 - mmengine - INFO - Iter(train) [53950/80000] base_lr: 8.7453e-05 lr: 8.7453e-05 eta: 14:09:46 time: 1.9433 data_time: 0.0192 memory: 6645 grad_norm: 10.9190 loss: 0.6218 decode.loss_ce: 0.4227 decode.acc_seg: 80.4762 aux.loss_ce: 0.1991 aux.acc_seg: 79.5046
2024/10/26 02:44:59 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 02:44:59 - mmengine - INFO - Iter(train) [54000/80000] base_lr: 8.7244e-05 lr: 8.7244e-05 eta: 14:08:08 time: 1.9430 data_time: 0.0186 memory: 6645 grad_norm: 5.1349 loss: 0.5210 decode.loss_ce: 0.3485 decode.acc_seg: 80.6732 aux.loss_ce: 0.1724 aux.acc_seg: 78.5256
2024/10/26 02:46:37 - mmengine - INFO - Iter(train) [54050/80000] base_lr: 8.7033e-05 lr: 8.7033e-05 eta: 14:06:30 time: 1.9506 data_time: 0.0187 memory: 6645 grad_norm: 5.8837 loss: 0.5980 decode.loss_ce: 0.3991 decode.acc_seg: 86.1017 aux.loss_ce: 0.1988 aux.acc_seg: 84.6409
2024/10/26 02:48:14 - mmengine - INFO - Iter(train) [54100/80000] base_lr: 8.6823e-05 lr: 8.6823e-05 eta: 14:04:52 time: 1.9485 data_time: 0.0192 memory: 6648 grad_norm: 5.2431 loss: 0.4398 decode.loss_ce: 0.2848 decode.acc_seg: 88.6970 aux.loss_ce: 0.1550 aux.acc_seg: 87.7960
2024/10/26 02:49:52 - mmengine - INFO - Iter(train) [54150/80000] base_lr: 8.6612e-05 lr: 8.6612e-05 eta: 14:03:14 time: 1.9526 data_time: 0.0195 memory: 6644 grad_norm: 4.8674 loss: 0.4835 decode.loss_ce: 0.3167 decode.acc_seg: 76.8727 aux.loss_ce: 0.1668 aux.acc_seg: 72.9634
2024/10/26 02:51:29 - mmengine - INFO - Iter(train) [54200/80000] base_lr: 8.6401e-05 lr: 8.6401e-05 eta: 14:01:36 time: 1.9520 data_time: 0.0186 memory: 6645 grad_norm: 4.6738 loss: 0.6213 decode.loss_ce: 0.4059 decode.acc_seg: 86.7017 aux.loss_ce: 0.2154 aux.acc_seg: 84.5631
2024/10/26 02:53:07 - mmengine - INFO - Iter(train) [54250/80000] base_lr: 8.6189e-05 lr: 8.6189e-05 eta: 13:59:58 time: 1.9443 data_time: 0.0183 memory: 6647 grad_norm: 4.0294 loss: 0.5195 decode.loss_ce: 0.3434 decode.acc_seg: 86.7668 aux.loss_ce: 0.1760 aux.acc_seg: 84.9515
2024/10/26 02:54:44 - mmengine - INFO - Iter(train) [54300/80000] base_lr: 8.5977e-05 lr: 8.5977e-05 eta: 13:58:20 time: 1.9389 data_time: 0.0176 memory: 6644 grad_norm: 3.9501 loss: 0.6408 decode.loss_ce: 0.4203 decode.acc_seg: 83.1058 aux.loss_ce: 0.2205 aux.acc_seg: 78.6041
2024/10/26 02:56:22 - mmengine - INFO - Iter(train) [54350/80000] base_lr: 8.5764e-05 lr: 8.5764e-05 eta: 13:56:42 time: 1.9660 data_time: 0.0192 memory: 6646 grad_norm: 4.5216 loss: 0.5420 decode.loss_ce: 0.3598 decode.acc_seg: 84.3027 aux.loss_ce: 0.1822 aux.acc_seg: 82.3139
2024/10/26 02:58:00 - mmengine - INFO - Iter(train) [54400/80000] base_lr: 8.5551e-05 lr: 8.5551e-05 eta: 13:55:04 time: 1.9457 data_time: 0.0189 memory: 6645 grad_norm: 4.7037 loss: 0.5190 decode.loss_ce: 0.3455 decode.acc_seg: 89.1858 aux.loss_ce: 0.1735 aux.acc_seg: 87.4628
2024/10/26 02:59:37 - mmengine - INFO - Iter(train) [54450/80000] base_lr: 8.5338e-05 lr: 8.5338e-05 eta: 13:53:26 time: 1.9467 data_time: 0.0179 memory: 6645 grad_norm: 4.5199 loss: 0.4768 decode.loss_ce: 0.3116 decode.acc_seg: 89.9228 aux.loss_ce: 0.1652 aux.acc_seg: 83.5333
2024/10/26 03:01:15 - mmengine - INFO - Iter(train) [54500/80000] base_lr: 8.5124e-05 lr: 8.5124e-05 eta: 13:51:48 time: 1.9557 data_time: 0.0184 memory: 6645 grad_norm: 4.0544 loss: 0.5092 decode.loss_ce: 0.3440 decode.acc_seg: 86.8415 aux.loss_ce: 0.1652 aux.acc_seg: 82.5731
2024/10/26 03:02:53 - mmengine - INFO - Iter(train) [54550/80000] base_lr: 8.4910e-05 lr: 8.4910e-05 eta: 13:50:10 time: 1.9598 data_time: 0.0174 memory: 6645 grad_norm: 3.6172 loss: 0.5107 decode.loss_ce: 0.3425 decode.acc_seg: 85.6169 aux.loss_ce: 0.1682 aux.acc_seg: 80.5480
2024/10/26 03:04:30 - mmengine - INFO - Iter(train) [54600/80000] base_lr: 8.4695e-05 lr: 8.4695e-05 eta: 13:48:33 time: 1.9552 data_time: 0.0199 memory: 6645 grad_norm: 5.0619 loss: 0.5311 decode.loss_ce: 0.3425 decode.acc_seg: 84.2588 aux.loss_ce: 0.1885 aux.acc_seg: 79.8501
2024/10/26 03:06:08 - mmengine - INFO - Iter(train) [54650/80000] base_lr: 8.4480e-05 lr: 8.4480e-05 eta: 13:46:55 time: 1.9494 data_time: 0.0179 memory: 6646 grad_norm: 4.1169 loss: 0.5911 decode.loss_ce: 0.3849 decode.acc_seg: 85.1311 aux.loss_ce: 0.2062 aux.acc_seg: 83.2027
2024/10/26 03:07:46 - mmengine - INFO - Iter(train) [54700/80000] base_lr: 8.4265e-05 lr: 8.4265e-05 eta: 13:45:17 time: 1.9496 data_time: 0.0194 memory: 6646 grad_norm: 4.2585 loss: 0.4989 decode.loss_ce: 0.3301 decode.acc_seg: 88.7273 aux.loss_ce: 0.1689 aux.acc_seg: 86.1336
2024/10/26 03:09:23 - mmengine - INFO - Iter(train) [54750/80000] base_lr: 8.4049e-05 lr: 8.4049e-05 eta: 13:43:39 time: 1.9477 data_time: 0.0188 memory: 6645 grad_norm: 5.5571 loss: 0.6705 decode.loss_ce: 0.4330 decode.acc_seg: 77.4944 aux.loss_ce: 0.2375 aux.acc_seg: 75.4895
2024/10/26 03:11:01 - mmengine - INFO - Iter(train) [54800/80000] base_lr: 8.3833e-05 lr: 8.3833e-05 eta: 13:42:01 time: 1.9659 data_time: 0.0161 memory: 6646 grad_norm: 6.2426 loss: 0.5479 decode.loss_ce: 0.3545 decode.acc_seg: 76.0472 aux.loss_ce: 0.1934 aux.acc_seg: 72.8693
2024/10/26 03:12:40 - mmengine - INFO - Iter(train) [54850/80000] base_lr: 8.3617e-05 lr: 8.3617e-05 eta: 13:40:23 time: 1.9703 data_time: 0.0182 memory: 6646 grad_norm: 3.9338 loss: 0.5684 decode.loss_ce: 0.3616 decode.acc_seg: 85.1188 aux.loss_ce: 0.2068 aux.acc_seg: 79.0246
2024/10/26 03:14:17 - mmengine - INFO - Iter(train) [54900/80000] base_lr: 8.3400e-05 lr: 8.3400e-05 eta: 13:38:45 time: 1.9535 data_time: 0.0187 memory: 6646 grad_norm: 7.3399 loss: 0.5302 decode.loss_ce: 0.3457 decode.acc_seg: 87.7725 aux.loss_ce: 0.1845 aux.acc_seg: 84.0929
2024/10/26 03:15:55 - mmengine - INFO - Iter(train) [54950/80000] base_lr: 8.3183e-05 lr: 8.3183e-05 eta: 13:37:07 time: 1.9602 data_time: 0.0186 memory: 6646 grad_norm: 4.4286 loss: 0.5338 decode.loss_ce: 0.3565 decode.acc_seg: 88.5675 aux.loss_ce: 0.1773 aux.acc_seg: 87.4943
2024/10/26 03:17:33 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 03:17:33 - mmengine - INFO - Iter(train) [55000/80000] base_lr: 8.2965e-05 lr: 8.2965e-05 eta: 13:35:29 time: 1.9427 data_time: 0.0182 memory: 6646 grad_norm: 5.2545 loss: 0.4764 decode.loss_ce: 0.3200 decode.acc_seg: 89.6739 aux.loss_ce: 0.1564 aux.acc_seg: 90.5918
2024/10/26 03:19:11 - mmengine - INFO - Iter(train) [55050/80000] base_lr: 8.2748e-05 lr: 8.2748e-05 eta: 13:33:52 time: 1.9469 data_time: 0.0182 memory: 6645 grad_norm: 5.9241 loss: 0.4895 decode.loss_ce: 0.3347 decode.acc_seg: 85.8063 aux.loss_ce: 0.1547 aux.acc_seg: 86.2499
2024/10/26 03:20:49 - mmengine - INFO - Iter(train) [55100/80000] base_lr: 8.2529e-05 lr: 8.2529e-05 eta: 13:32:14 time: 1.9516 data_time: 0.0192 memory: 6646 grad_norm: 3.1606 loss: 0.5664 decode.loss_ce: 0.3808 decode.acc_seg: 85.1114 aux.loss_ce: 0.1857 aux.acc_seg: 85.5483
2024/10/26 03:22:27 - mmengine - INFO - Iter(train) [55150/80000] base_lr: 8.2311e-05 lr: 8.2311e-05 eta: 13:30:36 time: 1.9463 data_time: 0.0187 memory: 6646 grad_norm: 4.3885 loss: 0.5226 decode.loss_ce: 0.3493 decode.acc_seg: 81.8608 aux.loss_ce: 0.1733 aux.acc_seg: 81.4842
2024/10/26 03:24:04 - mmengine - INFO - Iter(train) [55200/80000] base_lr: 8.2092e-05 lr: 8.2092e-05 eta: 13:28:58 time: 1.9499 data_time: 0.0187 memory: 6646 grad_norm: 5.2676 loss: 0.5551 decode.loss_ce: 0.3625 decode.acc_seg: 86.7767 aux.loss_ce: 0.1925 aux.acc_seg: 85.3506
2024/10/26 03:25:43 - mmengine - INFO - Iter(train) [55250/80000] base_lr: 8.1873e-05 lr: 8.1873e-05 eta: 13:27:20 time: 1.9484 data_time: 0.0181 memory: 6646 grad_norm: 3.8203 loss: 0.5159 decode.loss_ce: 0.3424 decode.acc_seg: 87.6604 aux.loss_ce: 0.1735 aux.acc_seg: 87.4629
2024/10/26 03:27:21 - mmengine - INFO - Iter(train) [55300/80000] base_lr: 8.1653e-05 lr: 8.1653e-05 eta: 13:25:43 time: 1.9537 data_time: 0.0179 memory: 6646 grad_norm: 5.5117 loss: 0.5537 decode.loss_ce: 0.3609 decode.acc_seg: 85.9960 aux.loss_ce: 0.1928 aux.acc_seg: 86.9554
2024/10/26 03:28:58 - mmengine - INFO - Iter(train) [55350/80000] base_lr: 8.1433e-05 lr: 8.1433e-05 eta: 13:24:04 time: 1.9449 data_time: 0.0198 memory: 6647 grad_norm: 5.5254 loss: 0.5834 decode.loss_ce: 0.3868 decode.acc_seg: 89.5255 aux.loss_ce: 0.1966 aux.acc_seg: 86.6981
2024/10/26 03:30:35 - mmengine - INFO - Iter(train) [55400/80000] base_lr: 8.1213e-05 lr: 8.1213e-05 eta: 13:22:26 time: 1.9478 data_time: 0.0199 memory: 6646 grad_norm: 4.2621 loss: 0.4933 decode.loss_ce: 0.3362 decode.acc_seg: 80.3191 aux.loss_ce: 0.1571 aux.acc_seg: 77.4778
2024/10/26 03:32:13 - mmengine - INFO - Iter(train) [55450/80000] base_lr: 8.0992e-05 lr: 8.0992e-05 eta: 13:20:48 time: 1.9408 data_time: 0.0197 memory: 6645 grad_norm: 3.5391 loss: 0.5156 decode.loss_ce: 0.3449 decode.acc_seg: 80.5553 aux.loss_ce: 0.1707 aux.acc_seg: 76.3252
2024/10/26 03:33:50 - mmengine - INFO - Iter(train) [55500/80000] base_lr: 8.0771e-05 lr: 8.0771e-05 eta: 13:19:10 time: 1.9488 data_time: 0.0193 memory: 6646 grad_norm: 4.3464 loss: 0.5971 decode.loss_ce: 0.3965 decode.acc_seg: 82.2910 aux.loss_ce: 0.2006 aux.acc_seg: 74.5190
2024/10/26 03:35:28 - mmengine - INFO - Iter(train) [55550/80000] base_lr: 8.0550e-05 lr: 8.0550e-05 eta: 13:17:32 time: 1.9447 data_time: 0.0197 memory: 6645 grad_norm: 5.5302 loss: 0.4645 decode.loss_ce: 0.3061 decode.acc_seg: 91.2024 aux.loss_ce: 0.1584 aux.acc_seg: 91.1912
2024/10/26 03:37:05 - mmengine - INFO - Iter(train) [55600/80000] base_lr: 8.0329e-05 lr: 8.0329e-05 eta: 13:15:54 time: 1.9506 data_time: 0.0169 memory: 6644 grad_norm: 4.5059 loss: 0.5670 decode.loss_ce: 0.3759 decode.acc_seg: 85.4483 aux.loss_ce: 0.1911 aux.acc_seg: 84.9417
2024/10/26 03:38:43 - mmengine - INFO - Iter(train) [55650/80000] base_lr: 8.0107e-05 lr: 8.0107e-05 eta: 13:14:17 time: 1.9520 data_time: 0.0187 memory: 6645 grad_norm: 4.1596 loss: 0.4953 decode.loss_ce: 0.3327 decode.acc_seg: 85.9969 aux.loss_ce: 0.1626 aux.acc_seg: 86.2987
2024/10/26 03:40:21 - mmengine - INFO - Iter(train) [55700/80000] base_lr: 7.9885e-05 lr: 7.9885e-05 eta: 13:12:38 time: 1.9509 data_time: 0.0193 memory: 6645 grad_norm: 5.4745 loss: 0.5308 decode.loss_ce: 0.3498 decode.acc_seg: 81.9814 aux.loss_ce: 0.1810 aux.acc_seg: 78.6200
2024/10/26 03:41:59 - mmengine - INFO - Iter(train) [55750/80000] base_lr: 7.9662e-05 lr: 7.9662e-05 eta: 13:11:01 time: 1.9471 data_time: 0.0183 memory: 6645 grad_norm: 4.2869 loss: 0.5501 decode.loss_ce: 0.3646 decode.acc_seg: 87.8107 aux.loss_ce: 0.1855 aux.acc_seg: 86.8188
2024/10/26 03:43:36 - mmengine - INFO - Iter(train) [55800/80000] base_lr: 7.9440e-05 lr: 7.9440e-05 eta: 13:09:23 time: 1.9483 data_time: 0.0197 memory: 6645 grad_norm: 4.1425 loss: 0.5077 decode.loss_ce: 0.3395 decode.acc_seg: 88.3067 aux.loss_ce: 0.1682 aux.acc_seg: 85.2395
2024/10/26 03:45:14 - mmengine - INFO - Iter(train) [55850/80000] base_lr: 7.9216e-05 lr: 7.9216e-05 eta: 13:07:45 time: 1.9481 data_time: 0.0189 memory: 6644 grad_norm: 4.8202 loss: 0.6089 decode.loss_ce: 0.4023 decode.acc_seg: 80.6689 aux.loss_ce: 0.2067 aux.acc_seg: 80.5795
2024/10/26 03:46:52 - mmengine - INFO - Iter(train) [55900/80000] base_lr: 7.8993e-05 lr: 7.8993e-05 eta: 13:06:07 time: 1.9696 data_time: 0.0193 memory: 6645 grad_norm: 7.5311 loss: 0.5874 decode.loss_ce: 0.3984 decode.acc_seg: 74.3564 aux.loss_ce: 0.1891 aux.acc_seg: 67.8599
2024/10/26 03:48:29 - mmengine - INFO - Iter(train) [55950/80000] base_lr: 7.8769e-05 lr: 7.8769e-05 eta: 13:04:29 time: 1.9494 data_time: 0.0184 memory: 6646 grad_norm: 4.8177 loss: 0.5914 decode.loss_ce: 0.3720 decode.acc_seg: 79.3881 aux.loss_ce: 0.2194 aux.acc_seg: 67.0185
2024/10/26 03:50:07 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 03:50:07 - mmengine - INFO - Iter(train) [56000/80000] base_lr: 7.8546e-05 lr: 7.8546e-05 eta: 13:02:51 time: 1.9585 data_time: 0.0189 memory: 6646 grad_norm: 5.6847 loss: 0.5432 decode.loss_ce: 0.3608 decode.acc_seg: 80.4825 aux.loss_ce: 0.1823 aux.acc_seg: 75.8823
2024/10/26 03:50:07 - mmengine - INFO - Saving checkpoint at 56000 iterations
2024/10/26 03:50:11 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0333 data_time: 0.0017 memory: 1049
2024/10/26 03:50:13 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0323 data_time: 0.0015 memory: 1117
2024/10/26 03:50:15 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0347 data_time: 0.0019 memory: 833
2024/10/26 03:50:16 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0335 data_time: 0.0017
memory: 866 2024/10/26 03:50:18 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0334 data_time: 0.0018 memory: 906
2024/10/26 03:50:20 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0344 data_time: 0.0022 memory: 2028
2024/10/26 03:50:22 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0322 data_time: 0.0017 memory: 832
2024/10/26 03:50:23 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0332 data_time: 0.0017 memory: 904
2024/10/26 03:50:25 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0318 data_time: 0.0015 memory: 839
2024/10/26 03:50:26 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0329 data_time: 0.0016 memory: 889
2024/10/26 03:50:28 - mmengine - INFO - per class results:
2024/10/26 03:50:28 - mmengine - INFO -
+---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 67.02 | 83.57 |
| building | 75.71 | 89.95 |
| sky | 88.32 | 94.46 |
| floor | 70.05 | 87.37 |
| tree | 65.8 | 82.97 |
| ceiling | 76.18 | 88.55 |
| road | 75.56 | 88.35 |
| bed | 78.7 | 88.23 |
| windowpane | 50.33 | 68.23 |
| grass | 62.69 | 78.9 |
| cabinet | 51.33 | 64.5 |
| sidewalk | 52.51 | 66.7 |
| person | 58.72 | 74.97 |
| earth | 31.62 | 49.96 |
| door | 34.19 | 46.16 |
| table | 39.73 | 55.84 |
| mountain | 46.83 | 58.34 |
| plant | 40.2 | 49.44 |
| curtain | 53.94 | 69.92 |
| chair | 38.19 | 52.87 |
| car | 68.52 | 81.6 |
| water | 40.25 | 51.39 |
| painting | 49.45 | 63.6 |
| sofa | 53.5 | 76.21 |
| shelf | 29.98 | 40.75 |
| house | 28.2 | 34.29 |
| sea | 35.79 | 55.3 |
| mirror | 51.81 | 59.4 |
| rug | 48.28 | 58.04 |
| field | 34.63 | 44.48 |
| armchair | 29.43 | 42.43 |
| seat | 50.01 | 75.43 |
| fence | 35.49 | 43.67 |
| desk | 39.24 | 49.65 |
| rock | 37.39 | 56.78 |
| wardrobe | 41.32 | 59.23 |
| lamp | 31.33 | 39.97 |
| bathtub | 65.33 | 75.44 |
| railing | 24.07 | 31.94 |
| cushion | 32.67 | 40.62 |
| base | 15.25 | 28.21 |
| box | 13.85 | 19.85 |
| column | 19.61 | 23.24 |
| signboard | 18.62 | 24.35 |
| chest of drawers | 36.85 | 58.7 |
| counter | 28.88 | 35.28 |
| sand | 26.72 | 41.38 |
| sink | 51.73 | 62.56 |
| skyscraper | 55.61 | 80.91 |
| fireplace | 58.05 | 68.8 |
| refrigerator | 68.1 | 79.26 |
| grandstand | 38.13 | 67.44 |
| path | 18.58 | 29.56 |
| stairs | 11.97 | 16.7 |
| runway | 56.1 | 70.24 |
| case | 42.42 | 55.53 |
| pool table | 76.13 | 82.44 |
| pillow | 37.69 | 47.01 |
| screen door | 55.31 | 65.41 |
| stairway | 19.31 | 32.69 |
| river | 9.01 | 34.73 |
| bridge | 31.73 | 36.41 |
| bookcase | 24.85 | 33.04 |
| blind | 37.26 | 44.24 |
| coffee table | 44.32 | 58.0 |
| toilet | 63.62 | 71.33 |
| flower | 22.45 | 37.43 |
| book | 33.25 | 49.78 |
| hill | 4.68 | 11.36 |
| bench | 34.94 | 41.1 |
| countertop | 40.14 | 51.88 |
| stove | 60.7 | 73.55 |
| palm | 35.8 | 50.68 |
| kitchen island | 23.06 | 49.23 |
| computer | 45.12 | 54.95 |
| swivel chair | 26.73 | 32.12 |
| boat | 44.25 | 61.13 |
| bar | 25.8 | 33.85 |
| arcade machine | 21.4 | 22.44 |
| hovel | 28.65 | 31.88 |
| bus | 71.97 | 83.58 |
| towel | 39.45 | 48.75 |
| light | 10.18 | 11.17 |
| truck | 17.66 | 25.17 |
| tower | 11.92 | 19.87 |
| chandelier | 48.07 | 65.28 |
| awning | 13.44 | 14.35 |
| streetlight | 4.7 | 5.49 |
| booth | 56.41 | 63.83 |
| television receiver | 51.79 | 64.48 |
| airplane | 37.21 | 46.87 |
| dirt track | 3.93 | 20.02 |
| apparel | 28.18 | 41.46 |
| pole | 4.87 | 6.01 |
| land | 0.96 | 1.76 |
| bannister | 4.19 | 5.85 |
| escalator | 42.08 | 61.66 |
| ottoman | 30.59 | 45.11 |
| bottle | 19.94 | 34.97 |
| buffet | 38.07 | 41.06 |
| poster | 22.25 | 35.06 |
| stage | 10.44 | 16.82 |
| van | 38.81 | 46.16 |
| ship | 52.88 | 67.69 |
| fountain | 18.0 | 18.2 |
| conveyer belt | 39.11 | 57.98 |
| canopy | 19.44 | 23.83 |
| washer | 55.35 | 57.92 |
| plaything | 5.97 | 9.52 |
| swimming pool | 63.69 | 65.06 |
| stool | 22.81 | 30.23 |
| barrel | 36.86 | 64.61 |
| basket | 13.49 | 19.67 |
| waterfall | 60.01 | 70.92 |
| tent | 89.43 | 93.48 |
| bag | 4.69 | 6.37 |
| minibike | 48.06 | 57.88 |
| cradle | 52.22 | 87.53 |
| oven | 39.66 | 50.53 |
| ball | 0.68 | 0.81 |
| food | 39.8 | 52.37 |
| step | 11.45 | 18.18 |
| tank | 45.46 | 50.7 |
| trade name | 14.59 | 18.35 |
| microwave | 42.92 | 48.84 |
| pot | 18.55 | 22.09 |
| animal | 43.99 | 51.15 |
| bicycle | 29.45 | 48.42 |
| lake | 9.0 | 9.32 |
| dishwasher | 38.7 | 41.93 |
| screen | 57.38 | 74.4 |
| blanket | 8.17 | 11.81 |
| sculpture | 29.65 | 42.4 |
| hood | 42.45 | 49.36 |
| sconce | 11.5 | 12.83 |
| vase | 15.88 | 21.94 |
| traffic light | 8.32 | 12.63 |
| tray | 1.77 | 3.39 |
| ashcan | 24.08 | 30.35 |
| fan | 24.17 | 30.91 |
| pier | 7.83 | 9.48 |
| crt screen | 5.52 | 14.99 |
| plate | 18.85 | 22.06 |
| monitor | 1.33 | 1.36 |
| bulletin board | 32.18 | 40.35 |
| shower | 1.34 | 1.65 |
| radiator | 30.65 | 35.8 |
| glass | 2.15 | 2.25 |
| clock | 18.67 | 20.75 |
| flag | 14.06 | 14.98 |
+---------------------+-------+-------+
2024/10/26 03:50:28 - mmengine - INFO - Iter(val) [500/500] aAcc: 75.4100 mIoU: 35.1400 mAcc: 45.2700 data_time: 0.0018 time: 0.0333
2024/10/26 03:52:05 - mmengine - INFO - Iter(train) [56050/80000] base_lr: 7.8321e-05 lr: 7.8321e-05 eta: 13:01:13 time: 1.9477 data_time: 0.0184 memory: 6645 grad_norm: 3.8638 loss: 0.5296 decode.loss_ce: 0.3482 decode.acc_seg: 87.7890 aux.loss_ce: 0.1814 aux.acc_seg: 79.3946
2024/10/26 03:53:44 - mmengine - INFO - Iter(train) [56100/80000] base_lr: 7.8097e-05 lr: 7.8097e-05 eta: 12:59:36 time: 1.9498 data_time: 0.0193 memory: 6647 grad_norm: 4.3952 loss: 0.6454 decode.loss_ce: 0.4408 decode.acc_seg: 87.8161 aux.loss_ce: 0.2046 aux.acc_seg: 87.7877
2024/10/26 03:55:21 - mmengine - INFO - Iter(train) [56150/80000] base_lr: 7.7872e-05 lr: 7.7872e-05 eta: 12:57:58 time: 1.9599 data_time: 0.0183 memory: 6646 grad_norm: 5.8944 loss: 0.5143 decode.loss_ce: 0.3433 decode.acc_seg: 91.1518 aux.loss_ce: 0.1710 aux.acc_seg: 92.1869
2024/10/26 03:56:59 -
mmengine - INFO - Iter(train) [56200/80000] base_lr: 7.7647e-05 lr: 7.7647e-05 eta: 12:56:20 time: 1.9499 data_time: 0.0187 memory: 6645 grad_norm: 3.9563 loss: 0.4734 decode.loss_ce: 0.3098 decode.acc_seg: 87.0069 aux.loss_ce: 0.1636 aux.acc_seg: 87.7479 2024/10/26 03:58:37 - mmengine - INFO - Iter(train) [56250/80000] base_lr: 7.7422e-05 lr: 7.7422e-05 eta: 12:54:42 time: 1.9605 data_time: 0.0180 memory: 6646 grad_norm: 3.9914 loss: 0.6160 decode.loss_ce: 0.4037 decode.acc_seg: 86.9457 aux.loss_ce: 0.2124 aux.acc_seg: 80.3741 2024/10/26 04:00:15 - mmengine - INFO - Iter(train) [56300/80000] base_lr: 7.7196e-05 lr: 7.7196e-05 eta: 12:53:04 time: 1.9521 data_time: 0.0191 memory: 6646 grad_norm: 4.5759 loss: 0.5108 decode.loss_ce: 0.3293 decode.acc_seg: 88.4594 aux.loss_ce: 0.1815 aux.acc_seg: 87.3258 2024/10/26 04:01:53 - mmengine - INFO - Iter(train) [56350/80000] base_lr: 7.6970e-05 lr: 7.6970e-05 eta: 12:51:26 time: 1.9515 data_time: 0.0186 memory: 6645 grad_norm: 3.3405 loss: 0.4977 decode.loss_ce: 0.3262 decode.acc_seg: 79.3880 aux.loss_ce: 0.1715 aux.acc_seg: 75.7785 2024/10/26 04:03:30 - mmengine - INFO - Iter(train) [56400/80000] base_lr: 7.6744e-05 lr: 7.6744e-05 eta: 12:49:48 time: 1.9466 data_time: 0.0191 memory: 6646 grad_norm: 5.6512 loss: 0.4971 decode.loss_ce: 0.3306 decode.acc_seg: 86.5372 aux.loss_ce: 0.1665 aux.acc_seg: 86.7912 2024/10/26 04:05:08 - mmengine - INFO - Iter(train) [56450/80000] base_lr: 7.6518e-05 lr: 7.6518e-05 eta: 12:48:11 time: 1.9525 data_time: 0.0202 memory: 6646 grad_norm: 6.3478 loss: 0.4485 decode.loss_ce: 0.3018 decode.acc_seg: 86.6951 aux.loss_ce: 0.1467 aux.acc_seg: 84.0362 2024/10/26 04:06:46 - mmengine - INFO - Iter(train) [56500/80000] base_lr: 7.6291e-05 lr: 7.6291e-05 eta: 12:46:33 time: 1.9523 data_time: 0.0186 memory: 6647 grad_norm: 4.6489 loss: 0.6051 decode.loss_ce: 0.4050 decode.acc_seg: 78.5788 aux.loss_ce: 0.2002 aux.acc_seg: 76.4250 2024/10/26 04:08:24 - mmengine - INFO - Iter(train) [56550/80000] base_lr: 
7.6064e-05 lr: 7.6064e-05 eta: 12:44:55 time: 1.9613 data_time: 0.0189 memory: 6646 grad_norm: 5.0108 loss: 0.4583 decode.loss_ce: 0.3041 decode.acc_seg: 85.4942 aux.loss_ce: 0.1543 aux.acc_seg: 84.7816 2024/10/26 04:10:01 - mmengine - INFO - Iter(train) [56600/80000] base_lr: 7.5837e-05 lr: 7.5837e-05 eta: 12:43:17 time: 1.9550 data_time: 0.0192 memory: 6645 grad_norm: 5.2241 loss: 0.5211 decode.loss_ce: 0.3368 decode.acc_seg: 80.6740 aux.loss_ce: 0.1842 aux.acc_seg: 79.5207 2024/10/26 04:11:43 - mmengine - INFO - Iter(train) [56650/80000] base_lr: 7.5610e-05 lr: 7.5610e-05 eta: 12:41:40 time: 1.9476 data_time: 0.0197 memory: 6646 grad_norm: 4.5116 loss: 0.5115 decode.loss_ce: 0.3297 decode.acc_seg: 82.8979 aux.loss_ce: 0.1817 aux.acc_seg: 80.4770 2024/10/26 04:13:20 - mmengine - INFO - Iter(train) [56700/80000] base_lr: 7.5382e-05 lr: 7.5382e-05 eta: 12:40:02 time: 1.9595 data_time: 0.0182 memory: 6646 grad_norm: 5.5303 loss: 0.4858 decode.loss_ce: 0.3289 decode.acc_seg: 84.9891 aux.loss_ce: 0.1569 aux.acc_seg: 83.2702 2024/10/26 04:14:58 - mmengine - INFO - Iter(train) [56750/80000] base_lr: 7.5154e-05 lr: 7.5154e-05 eta: 12:38:24 time: 1.9540 data_time: 0.0195 memory: 6645 grad_norm: 4.1625 loss: 0.5263 decode.loss_ce: 0.3497 decode.acc_seg: 84.9232 aux.loss_ce: 0.1766 aux.acc_seg: 82.0224 2024/10/26 04:16:35 - mmengine - INFO - Iter(train) [56800/80000] base_lr: 7.4926e-05 lr: 7.4926e-05 eta: 12:36:46 time: 1.9554 data_time: 0.0176 memory: 6646 grad_norm: 4.9758 loss: 0.4641 decode.loss_ce: 0.3076 decode.acc_seg: 91.8447 aux.loss_ce: 0.1565 aux.acc_seg: 90.5569 2024/10/26 04:18:14 - mmengine - INFO - Iter(train) [56850/80000] base_lr: 7.4698e-05 lr: 7.4698e-05 eta: 12:35:09 time: 1.9521 data_time: 0.0178 memory: 6645 grad_norm: 4.5974 loss: 0.5240 decode.loss_ce: 0.3477 decode.acc_seg: 87.9701 aux.loss_ce: 0.1763 aux.acc_seg: 86.7361 2024/10/26 04:19:51 - mmengine - INFO - Iter(train) [56900/80000] base_lr: 7.4469e-05 lr: 7.4469e-05 eta: 12:33:31 time: 1.9396 
data_time: 0.0191 memory: 6646 grad_norm: 4.2946 loss: 0.4903 decode.loss_ce: 0.3337 decode.acc_seg: 90.8525 aux.loss_ce: 0.1566 aux.acc_seg: 90.1938 2024/10/26 04:21:29 - mmengine - INFO - Iter(train) [56950/80000] base_lr: 7.4240e-05 lr: 7.4240e-05 eta: 12:31:53 time: 1.9466 data_time: 0.0176 memory: 6646 grad_norm: 4.1710 loss: 0.5071 decode.loss_ce: 0.3412 decode.acc_seg: 84.4548 aux.loss_ce: 0.1658 aux.acc_seg: 83.1250 2024/10/26 04:23:07 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 04:23:07 - mmengine - INFO - Iter(train) [57000/80000] base_lr: 7.4011e-05 lr: 7.4011e-05 eta: 12:30:15 time: 1.9522 data_time: 0.0168 memory: 6647 grad_norm: 4.8410 loss: 0.4773 decode.loss_ce: 0.3075 decode.acc_seg: 83.2267 aux.loss_ce: 0.1698 aux.acc_seg: 83.0233 2024/10/26 04:24:44 - mmengine - INFO - Iter(train) [57050/80000] base_lr: 7.3782e-05 lr: 7.3782e-05 eta: 12:28:37 time: 1.9543 data_time: 0.0185 memory: 6646 grad_norm: 4.2882 loss: 0.4685 decode.loss_ce: 0.3125 decode.acc_seg: 87.2213 aux.loss_ce: 0.1560 aux.acc_seg: 84.5884 2024/10/26 04:26:22 - mmengine - INFO - Iter(train) [57100/80000] base_lr: 7.3553e-05 lr: 7.3553e-05 eta: 12:26:59 time: 1.9532 data_time: 0.0177 memory: 6646 grad_norm: 4.4139 loss: 0.5133 decode.loss_ce: 0.3374 decode.acc_seg: 80.3183 aux.loss_ce: 0.1758 aux.acc_seg: 77.9025 2024/10/26 04:28:00 - mmengine - INFO - Iter(train) [57150/80000] base_lr: 7.3323e-05 lr: 7.3323e-05 eta: 12:25:21 time: 1.9735 data_time: 0.0192 memory: 6646 grad_norm: 4.2681 loss: 0.4643 decode.loss_ce: 0.3069 decode.acc_seg: 86.6005 aux.loss_ce: 0.1574 aux.acc_seg: 85.1659 2024/10/26 04:29:38 - mmengine - INFO - Iter(train) [57200/80000] base_lr: 7.3093e-05 lr: 7.3093e-05 eta: 12:23:43 time: 1.9531 data_time: 0.0187 memory: 6645 grad_norm: 3.9844 loss: 0.5592 decode.loss_ce: 0.3824 decode.acc_seg: 86.4432 aux.loss_ce: 0.1768 aux.acc_seg: 83.1506 2024/10/26 04:31:15 - mmengine - INFO - Iter(train) [57250/80000] 
base_lr: 7.2863e-05 lr: 7.2863e-05 eta: 12:22:05 time: 1.9475 data_time: 0.0187 memory: 6645 grad_norm: 4.8325 loss: 0.4834 decode.loss_ce: 0.3179 decode.acc_seg: 80.9308 aux.loss_ce: 0.1655 aux.acc_seg: 76.3456 2024/10/26 04:32:53 - mmengine - INFO - Iter(train) [57300/80000] base_lr: 7.2633e-05 lr: 7.2633e-05 eta: 12:20:27 time: 1.9474 data_time: 0.0188 memory: 6646 grad_norm: 4.3451 loss: 0.5078 decode.loss_ce: 0.3398 decode.acc_seg: 83.6158 aux.loss_ce: 0.1680 aux.acc_seg: 84.0575 2024/10/26 04:34:30 - mmengine - INFO - Iter(train) [57350/80000] base_lr: 7.2402e-05 lr: 7.2402e-05 eta: 12:18:49 time: 1.9569 data_time: 0.0190 memory: 6646 grad_norm: 4.8035 loss: 0.5326 decode.loss_ce: 0.3611 decode.acc_seg: 85.9921 aux.loss_ce: 0.1715 aux.acc_seg: 83.1048 2024/10/26 04:36:08 - mmengine - INFO - Iter(train) [57400/80000] base_lr: 7.2172e-05 lr: 7.2172e-05 eta: 12:17:11 time: 1.9500 data_time: 0.0189 memory: 6646 grad_norm: 4.0188 loss: 0.5352 decode.loss_ce: 0.3547 decode.acc_seg: 79.8861 aux.loss_ce: 0.1805 aux.acc_seg: 76.7244 2024/10/26 04:37:46 - mmengine - INFO - Iter(train) [57450/80000] base_lr: 7.1941e-05 lr: 7.1941e-05 eta: 12:15:34 time: 1.9499 data_time: 0.0200 memory: 6645 grad_norm: 4.5690 loss: 0.5484 decode.loss_ce: 0.3660 decode.acc_seg: 87.5944 aux.loss_ce: 0.1824 aux.acc_seg: 86.9011 2024/10/26 04:39:23 - mmengine - INFO - Iter(train) [57500/80000] base_lr: 7.1710e-05 lr: 7.1710e-05 eta: 12:13:56 time: 1.9498 data_time: 0.0185 memory: 6645 grad_norm: 4.6547 loss: 0.5211 decode.loss_ce: 0.3406 decode.acc_seg: 81.0723 aux.loss_ce: 0.1806 aux.acc_seg: 78.3089 2024/10/26 04:41:01 - mmengine - INFO - Iter(train) [57550/80000] base_lr: 7.1479e-05 lr: 7.1479e-05 eta: 12:12:18 time: 1.9446 data_time: 0.0193 memory: 6645 grad_norm: 5.6794 loss: 0.5209 decode.loss_ce: 0.3509 decode.acc_seg: 80.5779 aux.loss_ce: 0.1700 aux.acc_seg: 79.7287 2024/10/26 04:42:39 - mmengine - INFO - Iter(train) [57600/80000] base_lr: 7.1248e-05 lr: 7.1248e-05 eta: 12:10:40 
time: 1.9530 data_time: 0.0184 memory: 6647 grad_norm: 5.4204 loss: 0.5045 decode.loss_ce: 0.3385 decode.acc_seg: 86.0886 aux.loss_ce: 0.1660 aux.acc_seg: 83.9669 2024/10/26 04:44:17 - mmengine - INFO - Iter(train) [57650/80000] base_lr: 7.1016e-05 lr: 7.1016e-05 eta: 12:09:02 time: 1.9520 data_time: 0.0189 memory: 6646 grad_norm: 7.1380 loss: 0.5752 decode.loss_ce: 0.3913 decode.acc_seg: 89.7436 aux.loss_ce: 0.1840 aux.acc_seg: 88.0178 2024/10/26 04:45:55 - mmengine - INFO - Iter(train) [57700/80000] base_lr: 7.0784e-05 lr: 7.0784e-05 eta: 12:07:24 time: 1.9418 data_time: 0.0190 memory: 6646 grad_norm: 4.6277 loss: 0.4906 decode.loss_ce: 0.3223 decode.acc_seg: 86.8462 aux.loss_ce: 0.1683 aux.acc_seg: 82.2112 2024/10/26 04:47:33 - mmengine - INFO - Iter(train) [57750/80000] base_lr: 7.0552e-05 lr: 7.0552e-05 eta: 12:05:46 time: 1.9537 data_time: 0.0177 memory: 6645 grad_norm: 4.6592 loss: 0.6409 decode.loss_ce: 0.4136 decode.acc_seg: 82.1904 aux.loss_ce: 0.2273 aux.acc_seg: 78.7319 2024/10/26 04:49:11 - mmengine - INFO - Iter(train) [57800/80000] base_lr: 7.0320e-05 lr: 7.0320e-05 eta: 12:04:08 time: 1.9490 data_time: 0.0194 memory: 6645 grad_norm: 4.8801 loss: 0.6588 decode.loss_ce: 0.4475 decode.acc_seg: 87.0398 aux.loss_ce: 0.2113 aux.acc_seg: 87.4343 2024/10/26 04:50:49 - mmengine - INFO - Iter(train) [57850/80000] base_lr: 7.0088e-05 lr: 7.0088e-05 eta: 12:02:31 time: 1.9553 data_time: 0.0198 memory: 6646 grad_norm: 3.5240 loss: 0.5211 decode.loss_ce: 0.3495 decode.acc_seg: 81.4835 aux.loss_ce: 0.1715 aux.acc_seg: 78.5281 2024/10/26 04:52:26 - mmengine - INFO - Iter(train) [57900/80000] base_lr: 6.9856e-05 lr: 6.9856e-05 eta: 12:00:53 time: 1.9484 data_time: 0.0205 memory: 6646 grad_norm: 4.1493 loss: 0.5621 decode.loss_ce: 0.3830 decode.acc_seg: 83.4990 aux.loss_ce: 0.1791 aux.acc_seg: 87.1865 2024/10/26 04:54:04 - mmengine - INFO - Iter(train) [57950/80000] base_lr: 6.9623e-05 lr: 6.9623e-05 eta: 11:59:15 time: 1.9507 data_time: 0.0197 memory: 6646 
grad_norm: 6.7595 loss: 0.4358 decode.loss_ce: 0.2916 decode.acc_seg: 82.4665 aux.loss_ce: 0.1442 aux.acc_seg: 82.9152 2024/10/26 04:55:43 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 04:55:43 - mmengine - INFO - Iter(train) [58000/80000] base_lr: 6.9391e-05 lr: 6.9391e-05 eta: 11:57:37 time: 1.9567 data_time: 0.0194 memory: 6646 grad_norm: 4.6478 loss: 0.5663 decode.loss_ce: 0.3830 decode.acc_seg: 83.6281 aux.loss_ce: 0.1833 aux.acc_seg: 84.8150 2024/10/26 04:57:21 - mmengine - INFO - Iter(train) [58050/80000] base_lr: 6.9158e-05 lr: 6.9158e-05 eta: 11:55:59 time: 1.9487 data_time: 0.0197 memory: 6646 grad_norm: 3.8875 loss: 0.4897 decode.loss_ce: 0.3274 decode.acc_seg: 87.3128 aux.loss_ce: 0.1622 aux.acc_seg: 86.0051 2024/10/26 04:58:58 - mmengine - INFO - Iter(train) [58100/80000] base_lr: 6.8925e-05 lr: 6.8925e-05 eta: 11:54:21 time: 1.9504 data_time: 0.0191 memory: 6645 grad_norm: 4.1659 loss: 0.4954 decode.loss_ce: 0.3412 decode.acc_seg: 84.6512 aux.loss_ce: 0.1542 aux.acc_seg: 79.9191 2024/10/26 05:00:36 - mmengine - INFO - Iter(train) [58150/80000] base_lr: 6.8692e-05 lr: 6.8692e-05 eta: 11:52:43 time: 1.9499 data_time: 0.0189 memory: 6645 grad_norm: 3.9471 loss: 0.4553 decode.loss_ce: 0.3116 decode.acc_seg: 86.6877 aux.loss_ce: 0.1437 aux.acc_seg: 85.8238 2024/10/26 05:02:13 - mmengine - INFO - Iter(train) [58200/80000] base_lr: 6.8459e-05 lr: 6.8459e-05 eta: 11:51:05 time: 1.9571 data_time: 0.0189 memory: 6647 grad_norm: 4.7581 loss: 0.5338 decode.loss_ce: 0.3579 decode.acc_seg: 84.5971 aux.loss_ce: 0.1759 aux.acc_seg: 83.0111 2024/10/26 05:03:51 - mmengine - INFO - Iter(train) [58250/80000] base_lr: 6.8225e-05 lr: 6.8225e-05 eta: 11:49:27 time: 1.9458 data_time: 0.0190 memory: 6645 grad_norm: 5.0335 loss: 0.4896 decode.loss_ce: 0.3195 decode.acc_seg: 91.6599 aux.loss_ce: 0.1701 aux.acc_seg: 87.0467 2024/10/26 05:05:28 - mmengine - INFO - Iter(train) [58300/80000] base_lr: 6.7992e-05 lr: 6.7992e-05 
eta: 11:47:49 time: 1.9474 data_time: 0.0184 memory: 6646 grad_norm: 5.7561 loss: 0.5213 decode.loss_ce: 0.3482 decode.acc_seg: 88.5293 aux.loss_ce: 0.1731 aux.acc_seg: 84.7977 2024/10/26 05:07:06 - mmengine - INFO - Iter(train) [58350/80000] base_lr: 6.7758e-05 lr: 6.7758e-05 eta: 11:46:11 time: 1.9649 data_time: 0.0179 memory: 6645 grad_norm: 4.5907 loss: 0.4319 decode.loss_ce: 0.2902 decode.acc_seg: 88.2222 aux.loss_ce: 0.1417 aux.acc_seg: 89.1629 2024/10/26 05:08:43 - mmengine - INFO - Iter(train) [58400/80000] base_lr: 6.7525e-05 lr: 6.7525e-05 eta: 11:44:33 time: 1.9433 data_time: 0.0188 memory: 6644 grad_norm: 4.3078 loss: 0.5567 decode.loss_ce: 0.3699 decode.acc_seg: 86.1654 aux.loss_ce: 0.1868 aux.acc_seg: 85.3037 2024/10/26 05:10:21 - mmengine - INFO - Iter(train) [58450/80000] base_lr: 6.7291e-05 lr: 6.7291e-05 eta: 11:42:55 time: 1.9467 data_time: 0.0186 memory: 6647 grad_norm: 3.9467 loss: 0.4420 decode.loss_ce: 0.2959 decode.acc_seg: 85.8117 aux.loss_ce: 0.1460 aux.acc_seg: 87.9685 2024/10/26 05:11:58 - mmengine - INFO - Iter(train) [58500/80000] base_lr: 6.7057e-05 lr: 6.7057e-05 eta: 11:41:17 time: 1.9439 data_time: 0.0187 memory: 6645 grad_norm: 6.8362 loss: 0.6197 decode.loss_ce: 0.4217 decode.acc_seg: 87.3823 aux.loss_ce: 0.1980 aux.acc_seg: 83.9277 2024/10/26 05:13:36 - mmengine - INFO - Iter(train) [58550/80000] base_lr: 6.6823e-05 lr: 6.6823e-05 eta: 11:39:40 time: 1.9491 data_time: 0.0183 memory: 6645 grad_norm: 5.1115 loss: 0.5364 decode.loss_ce: 0.3527 decode.acc_seg: 89.2273 aux.loss_ce: 0.1837 aux.acc_seg: 84.5408 2024/10/26 05:15:14 - mmengine - INFO - Iter(train) [58600/80000] base_lr: 6.6589e-05 lr: 6.6589e-05 eta: 11:38:02 time: 1.9565 data_time: 0.0189 memory: 6645 grad_norm: 3.4581 loss: 0.4592 decode.loss_ce: 0.3119 decode.acc_seg: 86.8398 aux.loss_ce: 0.1472 aux.acc_seg: 85.0778 2024/10/26 05:16:51 - mmengine - INFO - Iter(train) [58650/80000] base_lr: 6.6354e-05 lr: 6.6354e-05 eta: 11:36:24 time: 1.9413 data_time: 0.0181 memory: 
6646 grad_norm: 4.7505 loss: 0.4904 decode.loss_ce: 0.3238 decode.acc_seg: 84.3596 aux.loss_ce: 0.1666 aux.acc_seg: 80.8733 2024/10/26 05:18:29 - mmengine - INFO - Iter(train) [58700/80000] base_lr: 6.6120e-05 lr: 6.6120e-05 eta: 11:34:46 time: 1.9492 data_time: 0.0188 memory: 6646 grad_norm: 5.5522 loss: 0.4624 decode.loss_ce: 0.3122 decode.acc_seg: 86.8884 aux.loss_ce: 0.1502 aux.acc_seg: 83.0751 2024/10/26 05:20:07 - mmengine - INFO - Iter(train) [58750/80000] base_lr: 6.5886e-05 lr: 6.5886e-05 eta: 11:33:08 time: 1.9422 data_time: 0.0192 memory: 6646 grad_norm: 5.1950 loss: 0.5403 decode.loss_ce: 0.3530 decode.acc_seg: 83.9086 aux.loss_ce: 0.1872 aux.acc_seg: 84.5949 2024/10/26 05:21:45 - mmengine - INFO - Iter(train) [58800/80000] base_lr: 6.5651e-05 lr: 6.5651e-05 eta: 11:31:30 time: 1.9442 data_time: 0.0181 memory: 6645 grad_norm: 4.2215 loss: 0.5269 decode.loss_ce: 0.3540 decode.acc_seg: 82.0952 aux.loss_ce: 0.1728 aux.acc_seg: 78.7814 2024/10/26 05:23:22 - mmengine - INFO - Iter(train) [58850/80000] base_lr: 6.5417e-05 lr: 6.5417e-05 eta: 11:29:52 time: 1.9517 data_time: 0.0181 memory: 6645 grad_norm: 4.1457 loss: 0.4834 decode.loss_ce: 0.3161 decode.acc_seg: 84.6654 aux.loss_ce: 0.1673 aux.acc_seg: 80.5392 2024/10/26 05:25:00 - mmengine - INFO - Iter(train) [58900/80000] base_lr: 6.5182e-05 lr: 6.5182e-05 eta: 11:28:14 time: 1.9448 data_time: 0.0195 memory: 6646 grad_norm: 5.8792 loss: 0.5832 decode.loss_ce: 0.3894 decode.acc_seg: 87.7591 aux.loss_ce: 0.1939 aux.acc_seg: 80.3876 2024/10/26 05:26:38 - mmengine - INFO - Iter(train) [58950/80000] base_lr: 6.4947e-05 lr: 6.4947e-05 eta: 11:26:36 time: 1.9480 data_time: 0.0195 memory: 6645 grad_norm: 4.2879 loss: 0.5597 decode.loss_ce: 0.3682 decode.acc_seg: 87.2195 aux.loss_ce: 0.1915 aux.acc_seg: 87.6786 2024/10/26 05:28:15 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 05:28:15 - mmengine - INFO - Iter(train) [59000/80000] base_lr: 6.4712e-05 lr: 
6.4712e-05 eta: 11:24:58 time: 1.9592 data_time: 0.0190 memory: 6647 grad_norm: 4.1216 loss: 0.4740 decode.loss_ce: 0.3128 decode.acc_seg: 88.5037 aux.loss_ce: 0.1612 aux.acc_seg: 83.5656 2024/10/26 05:29:53 - mmengine - INFO - Iter(train) [59050/80000] base_lr: 6.4477e-05 lr: 6.4477e-05 eta: 11:23:20 time: 1.9430 data_time: 0.0195 memory: 6645 grad_norm: 3.9854 loss: 0.5447 decode.loss_ce: 0.3653 decode.acc_seg: 83.9700 aux.loss_ce: 0.1794 aux.acc_seg: 80.9260 2024/10/26 05:31:31 - mmengine - INFO - Iter(train) [59100/80000] base_lr: 6.4242e-05 lr: 6.4242e-05 eta: 11:21:42 time: 1.9496 data_time: 0.0195 memory: 6645 grad_norm: 4.4521 loss: 0.5074 decode.loss_ce: 0.3369 decode.acc_seg: 90.0105 aux.loss_ce: 0.1705 aux.acc_seg: 87.3975 2024/10/26 05:33:09 - mmengine - INFO - Iter(train) [59150/80000] base_lr: 6.4007e-05 lr: 6.4007e-05 eta: 11:20:05 time: 1.9757 data_time: 0.0188 memory: 6646 grad_norm: 5.2492 loss: 0.4851 decode.loss_ce: 0.3241 decode.acc_seg: 88.3989 aux.loss_ce: 0.1610 aux.acc_seg: 82.7831 2024/10/26 05:34:47 - mmengine - INFO - Iter(train) [59200/80000] base_lr: 6.3772e-05 lr: 6.3772e-05 eta: 11:18:27 time: 1.9560 data_time: 0.0177 memory: 6645 grad_norm: 3.4906 loss: 0.4635 decode.loss_ce: 0.3117 decode.acc_seg: 89.1224 aux.loss_ce: 0.1518 aux.acc_seg: 89.0759 2024/10/26 05:36:24 - mmengine - INFO - Iter(train) [59250/80000] base_lr: 6.3537e-05 lr: 6.3537e-05 eta: 11:16:49 time: 1.9490 data_time: 0.0191 memory: 6646 grad_norm: 4.2067 loss: 0.5330 decode.loss_ce: 0.3576 decode.acc_seg: 86.1658 aux.loss_ce: 0.1753 aux.acc_seg: 87.1544 2024/10/26 05:38:02 - mmengine - INFO - Iter(train) [59300/80000] base_lr: 6.3302e-05 lr: 6.3302e-05 eta: 11:15:11 time: 1.9455 data_time: 0.0188 memory: 6645 grad_norm: 5.5323 loss: 0.4678 decode.loss_ce: 0.3127 decode.acc_seg: 89.1170 aux.loss_ce: 0.1551 aux.acc_seg: 84.8791 2024/10/26 05:39:43 - mmengine - INFO - Iter(train) [59350/80000] base_lr: 6.3066e-05 lr: 6.3066e-05 eta: 11:13:34 time: 1.9553 data_time: 
0.0175 memory: 6646 grad_norm: 4.8022 loss: 0.5903 decode.loss_ce: 0.3950 decode.acc_seg: 88.8783 aux.loss_ce: 0.1953 aux.acc_seg: 85.0864 2024/10/26 05:41:20 - mmengine - INFO - Iter(train) [59400/80000] base_lr: 6.2831e-05 lr: 6.2831e-05 eta: 11:11:56 time: 1.9657 data_time: 0.0185 memory: 6646 grad_norm: 5.5411 loss: 0.4203 decode.loss_ce: 0.2762 decode.acc_seg: 91.7216 aux.loss_ce: 0.1442 aux.acc_seg: 92.7349 2024/10/26 05:42:58 - mmengine - INFO - Iter(train) [59450/80000] base_lr: 6.2596e-05 lr: 6.2596e-05 eta: 11:10:18 time: 1.9516 data_time: 0.0188 memory: 6646 grad_norm: 3.7788 loss: 0.4487 decode.loss_ce: 0.2937 decode.acc_seg: 91.1339 aux.loss_ce: 0.1550 aux.acc_seg: 91.6552 2024/10/26 05:44:36 - mmengine - INFO - Iter(train) [59500/80000] base_lr: 6.2360e-05 lr: 6.2360e-05 eta: 11:08:40 time: 1.9580 data_time: 0.0170 memory: 6645 grad_norm: 4.1127 loss: 0.4805 decode.loss_ce: 0.3042 decode.acc_seg: 89.1660 aux.loss_ce: 0.1762 aux.acc_seg: 88.0732 2024/10/26 05:46:13 - mmengine - INFO - Iter(train) [59550/80000] base_lr: 6.2125e-05 lr: 6.2125e-05 eta: 11:07:02 time: 1.9600 data_time: 0.0198 memory: 6645 grad_norm: 4.4220 loss: 0.5074 decode.loss_ce: 0.3294 decode.acc_seg: 88.8715 aux.loss_ce: 0.1780 aux.acc_seg: 83.9818 2024/10/26 05:47:52 - mmengine - INFO - Iter(train) [59600/80000] base_lr: 6.1889e-05 lr: 6.1889e-05 eta: 11:05:25 time: 1.9807 data_time: 0.0172 memory: 6645 grad_norm: 2.8539 loss: 0.5439 decode.loss_ce: 0.3711 decode.acc_seg: 87.1987 aux.loss_ce: 0.1728 aux.acc_seg: 81.8082 2024/10/26 05:49:29 - mmengine - INFO - Iter(train) [59650/80000] base_lr: 6.1654e-05 lr: 6.1654e-05 eta: 11:03:47 time: 1.9471 data_time: 0.0189 memory: 6645 grad_norm: 8.4673 loss: 0.6191 decode.loss_ce: 0.4057 decode.acc_seg: 82.8080 aux.loss_ce: 0.2134 aux.acc_seg: 81.9364 2024/10/26 05:51:07 - mmengine - INFO - Iter(train) [59700/80000] base_lr: 6.1418e-05 lr: 6.1418e-05 eta: 11:02:09 time: 1.9611 data_time: 0.0179 memory: 6645 grad_norm: 6.1646 loss: 0.4204 
decode.loss_ce: 0.2766 decode.acc_seg: 81.3565 aux.loss_ce: 0.1438 aux.acc_seg: 80.5190 2024/10/26 05:52:44 - mmengine - INFO - Iter(train) [59750/80000] base_lr: 6.1183e-05 lr: 6.1183e-05 eta: 11:00:31 time: 1.9464 data_time: 0.0185 memory: 6646 grad_norm: 4.4896 loss: 0.5526 decode.loss_ce: 0.3752 decode.acc_seg: 88.7233 aux.loss_ce: 0.1774 aux.acc_seg: 87.8849 2024/10/26 05:54:22 - mmengine - INFO - Iter(train) [59800/80000] base_lr: 6.0947e-05 lr: 6.0947e-05 eta: 10:58:53 time: 1.9395 data_time: 0.0190 memory: 6645 grad_norm: 2.9559 loss: 0.4746 decode.loss_ce: 0.3202 decode.acc_seg: 84.2396 aux.loss_ce: 0.1544 aux.acc_seg: 83.8725 2024/10/26 05:55:59 - mmengine - INFO - Iter(train) [59850/80000] base_lr: 6.0712e-05 lr: 6.0712e-05 eta: 10:57:15 time: 1.9450 data_time: 0.0185 memory: 6645 grad_norm: 6.0305 loss: 0.6062 decode.loss_ce: 0.4030 decode.acc_seg: 80.7004 aux.loss_ce: 0.2032 aux.acc_seg: 78.7677 2024/10/26 05:57:37 - mmengine - INFO - Iter(train) [59900/80000] base_lr: 6.0476e-05 lr: 6.0476e-05 eta: 10:55:37 time: 1.9469 data_time: 0.0183 memory: 6645 grad_norm: 4.1411 loss: 0.4945 decode.loss_ce: 0.3306 decode.acc_seg: 88.9388 aux.loss_ce: 0.1640 aux.acc_seg: 89.3596 2024/10/26 05:59:15 - mmengine - INFO - Iter(train) [59950/80000] base_lr: 6.0240e-05 lr: 6.0240e-05 eta: 10:53:59 time: 1.9542 data_time: 0.0184 memory: 6646 grad_norm: 4.7026 loss: 0.4455 decode.loss_ce: 0.3029 decode.acc_seg: 80.7590 aux.loss_ce: 0.1426 aux.acc_seg: 78.2000 2024/10/26 06:00:52 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 06:00:52 - mmengine - INFO - Iter(train) [60000/80000] base_lr: 6.0005e-05 lr: 6.0005e-05 eta: 10:52:21 time: 1.9447 data_time: 0.0189 memory: 6647 grad_norm: 3.9166 loss: 0.5416 decode.loss_ce: 0.3546 decode.acc_seg: 86.3257 aux.loss_ce: 0.1870 aux.acc_seg: 85.6461 2024/10/26 06:02:30 - mmengine - INFO - Iter(train) [60050/80000] base_lr: 5.9769e-05 lr: 5.9769e-05 eta: 10:50:43 time: 1.9707 
data_time: 0.0191 memory: 6646 grad_norm: 3.5763 loss: 0.4965 decode.loss_ce: 0.3358 decode.acc_seg: 82.7450 aux.loss_ce: 0.1607 aux.acc_seg: 81.8377 2024/10/26 06:04:08 - mmengine - INFO - Iter(train) [60100/80000] base_lr: 5.9533e-05 lr: 5.9533e-05 eta: 10:49:05 time: 1.9516 data_time: 0.0204 memory: 6646 grad_norm: 3.7981 loss: 0.4551 decode.loss_ce: 0.3051 decode.acc_seg: 82.9833 aux.loss_ce: 0.1500 aux.acc_seg: 80.7071 2024/10/26 06:05:46 - mmengine - INFO - Iter(train) [60150/80000] base_lr: 5.9298e-05 lr: 5.9298e-05 eta: 10:47:27 time: 1.9559 data_time: 0.0183 memory: 6645 grad_norm: 3.6475 loss: 0.4277 decode.loss_ce: 0.2844 decode.acc_seg: 86.3249 aux.loss_ce: 0.1433 aux.acc_seg: 85.4116 2024/10/26 06:07:23 - mmengine - INFO - Iter(train) [60200/80000] base_lr: 5.9062e-05 lr: 5.9062e-05 eta: 10:45:49 time: 1.9606 data_time: 0.0197 memory: 6646 grad_norm: 4.7166 loss: 0.4258 decode.loss_ce: 0.2901 decode.acc_seg: 91.1770 aux.loss_ce: 0.1357 aux.acc_seg: 91.7474 2024/10/26 06:09:01 - mmengine - INFO - Iter(train) [60250/80000] base_lr: 5.8827e-05 lr: 5.8827e-05 eta: 10:44:11 time: 1.9480 data_time: 0.0179 memory: 6646 grad_norm: 3.9437 loss: 0.6086 decode.loss_ce: 0.4087 decode.acc_seg: 82.4845 aux.loss_ce: 0.1999 aux.acc_seg: 76.8313 2024/10/26 06:10:38 - mmengine - INFO - Iter(train) [60300/80000] base_lr: 5.8591e-05 lr: 5.8591e-05 eta: 10:42:33 time: 1.9476 data_time: 0.0195 memory: 6645 grad_norm: 4.9057 loss: 0.4905 decode.loss_ce: 0.3395 decode.acc_seg: 81.7421 aux.loss_ce: 0.1510 aux.acc_seg: 80.6983 2024/10/26 06:12:15 - mmengine - INFO - Iter(train) [60350/80000] base_lr: 5.8356e-05 lr: 5.8356e-05 eta: 10:40:55 time: 1.9495 data_time: 0.0181 memory: 6645 grad_norm: 4.4877 loss: 0.5669 decode.loss_ce: 0.3773 decode.acc_seg: 88.4588 aux.loss_ce: 0.1897 aux.acc_seg: 85.8981 2024/10/26 06:13:53 - mmengine - INFO - Iter(train) [60400/80000] base_lr: 5.8120e-05 lr: 5.8120e-05 eta: 10:39:17 time: 1.9472 data_time: 0.0185 memory: 6647 grad_norm: 5.7898 
loss: 0.5787 decode.loss_ce: 0.3777 decode.acc_seg: 85.7570 aux.loss_ce: 0.2010 aux.acc_seg: 78.9721 2024/10/26 06:15:31 - mmengine - INFO - Iter(train) [60450/80000] base_lr: 5.7885e-05 lr: 5.7885e-05 eta: 10:37:40 time: 1.9410 data_time: 0.0185 memory: 6646 grad_norm: 5.9092 loss: 0.5401 decode.loss_ce: 0.3598 decode.acc_seg: 82.4891 aux.loss_ce: 0.1803 aux.acc_seg: 82.0358 2024/10/26 06:17:08 - mmengine - INFO - Iter(train) [60500/80000] base_lr: 5.7649e-05 lr: 5.7649e-05 eta: 10:36:02 time: 1.9426 data_time: 0.0190 memory: 6645 grad_norm: 3.8926 loss: 0.5098 decode.loss_ce: 0.3432 decode.acc_seg: 78.2698 aux.loss_ce: 0.1666 aux.acc_seg: 76.7986 2024/10/26 06:18:46 - mmengine - INFO - Iter(train) [60550/80000] base_lr: 5.7414e-05 lr: 5.7414e-05 eta: 10:34:24 time: 1.9485 data_time: 0.0178 memory: 6648 grad_norm: 3.3949 loss: 0.5539 decode.loss_ce: 0.3623 decode.acc_seg: 83.5570 aux.loss_ce: 0.1917 aux.acc_seg: 83.3540 2024/10/26 06:20:24 - mmengine - INFO - Iter(train) [60600/80000] base_lr: 5.7178e-05 lr: 5.7178e-05 eta: 10:32:46 time: 1.9420 data_time: 0.0185 memory: 6646 grad_norm: 4.4377 loss: 0.5543 decode.loss_ce: 0.3574 decode.acc_seg: 81.4505 aux.loss_ce: 0.1969 aux.acc_seg: 79.3702 2024/10/26 06:22:01 - mmengine - INFO - Iter(train) [60650/80000] base_lr: 5.6943e-05 lr: 5.6943e-05 eta: 10:31:08 time: 1.9621 data_time: 0.0184 memory: 6646 grad_norm: 4.5004 loss: 0.5491 decode.loss_ce: 0.3612 decode.acc_seg: 82.3887 aux.loss_ce: 0.1880 aux.acc_seg: 75.7071 2024/10/26 06:23:44 - mmengine - INFO - Iter(train) [60700/80000] base_lr: 5.6708e-05 lr: 5.6708e-05 eta: 10:29:31 time: 1.9507 data_time: 0.0193 memory: 6646 grad_norm: 4.2625 loss: 0.4603 decode.loss_ce: 0.3119 decode.acc_seg: 86.0047 aux.loss_ce: 0.1484 aux.acc_seg: 85.3879 2024/10/26 06:25:21 - mmengine - INFO - Iter(train) [60750/80000] base_lr: 5.6472e-05 lr: 5.6472e-05 eta: 10:27:53 time: 1.9441 data_time: 0.0189 memory: 6645 grad_norm: 4.2580 loss: 0.4993 decode.loss_ce: 0.3376 decode.acc_seg: 
85.6217 aux.loss_ce: 0.1617 aux.acc_seg: 74.0407 2024/10/26 06:26:58 - mmengine - INFO - Iter(train) [60800/80000] base_lr: 5.6237e-05 lr: 5.6237e-05 eta: 10:26:15 time: 1.9439 data_time: 0.0187 memory: 6646 grad_norm: 4.1034 loss: 0.5250 decode.loss_ce: 0.3400 decode.acc_seg: 90.9424 aux.loss_ce: 0.1850 aux.acc_seg: 89.9841 2024/10/26 06:28:36 - mmengine - INFO - Iter(train) [60850/80000] base_lr: 5.6002e-05 lr: 5.6002e-05 eta: 10:24:37 time: 1.9511 data_time: 0.0187 memory: 6646 grad_norm: 3.4396 loss: 0.5422 decode.loss_ce: 0.3534 decode.acc_seg: 87.4637 aux.loss_ce: 0.1888 aux.acc_seg: 80.2013 2024/10/26 06:30:13 - mmengine - INFO - Iter(train) [60900/80000] base_lr: 5.5767e-05 lr: 5.5767e-05 eta: 10:22:59 time: 1.9495 data_time: 0.0189 memory: 6646 grad_norm: 5.5244 loss: 0.5324 decode.loss_ce: 0.3417 decode.acc_seg: 87.6147 aux.loss_ce: 0.1907 aux.acc_seg: 86.8921 2024/10/26 06:31:51 - mmengine - INFO - Iter(train) [60950/80000] base_lr: 5.5532e-05 lr: 5.5532e-05 eta: 10:21:22 time: 1.9430 data_time: 0.0196 memory: 6645 grad_norm: 3.2390 loss: 0.4838 decode.loss_ce: 0.3221 decode.acc_seg: 92.7365 aux.loss_ce: 0.1616 aux.acc_seg: 91.5500 2024/10/26 06:33:29 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 06:33:29 - mmengine - INFO - Iter(train) [61000/80000] base_lr: 5.5297e-05 lr: 5.5297e-05 eta: 10:19:44 time: 1.9481 data_time: 0.0180 memory: 6646 grad_norm: 3.5062 loss: 0.4692 decode.loss_ce: 0.3054 decode.acc_seg: 87.7838 aux.loss_ce: 0.1638 aux.acc_seg: 87.8660 2024/10/26 06:35:07 - mmengine - INFO - Iter(train) [61050/80000] base_lr: 5.5062e-05 lr: 5.5062e-05 eta: 10:18:06 time: 1.9520 data_time: 0.0197 memory: 6646 grad_norm: 3.8928 loss: 0.4995 decode.loss_ce: 0.3350 decode.acc_seg: 84.1500 aux.loss_ce: 0.1645 aux.acc_seg: 82.2449 2024/10/26 06:36:44 - mmengine - INFO - Iter(train) [61100/80000] base_lr: 5.4828e-05 lr: 5.4828e-05 eta: 10:16:28 time: 1.9560 data_time: 0.0175 memory: 6646 grad_norm: 
4.8623 loss: 0.5025 decode.loss_ce: 0.3418 decode.acc_seg: 87.0883 aux.loss_ce: 0.1606 aux.acc_seg: 87.0061 2024/10/26 06:38:22 - mmengine - INFO - Iter(train) [61150/80000] base_lr: 5.4593e-05 lr: 5.4593e-05 eta: 10:14:50 time: 1.9739 data_time: 0.0184 memory: 6646 grad_norm: 6.9524 loss: 0.4612 decode.loss_ce: 0.3085 decode.acc_seg: 83.0323 aux.loss_ce: 0.1527 aux.acc_seg: 83.9323 2024/10/26 06:40:00 - mmengine - INFO - Iter(train) [61200/80000] base_lr: 5.4358e-05 lr: 5.4358e-05 eta: 10:13:12 time: 1.9510 data_time: 0.0166 memory: 6646 grad_norm: 4.2048 loss: 0.5046 decode.loss_ce: 0.3431 decode.acc_seg: 85.7810 aux.loss_ce: 0.1615 aux.acc_seg: 86.2957 2024/10/26 06:41:38 - mmengine - INFO - Iter(train) [61250/80000] base_lr: 5.4124e-05 lr: 5.4124e-05 eta: 10:11:34 time: 1.9475 data_time: 0.0176 memory: 6645 grad_norm: 3.9972 loss: 0.4296 decode.loss_ce: 0.2824 decode.acc_seg: 85.3520 aux.loss_ce: 0.1472 aux.acc_seg: 86.6659 2024/10/26 06:43:16 - mmengine - INFO - Iter(train) [61300/80000] base_lr: 5.3889e-05 lr: 5.3889e-05 eta: 10:09:56 time: 1.9524 data_time: 0.0197 memory: 6645 grad_norm: 3.4175 loss: 0.4312 decode.loss_ce: 0.2821 decode.acc_seg: 84.3029 aux.loss_ce: 0.1491 aux.acc_seg: 79.6844 2024/10/26 06:44:54 - mmengine - INFO - Iter(train) [61350/80000] base_lr: 5.3655e-05 lr: 5.3655e-05 eta: 10:08:19 time: 1.9555 data_time: 0.0189 memory: 6645 grad_norm: 3.4803 loss: 0.4527 decode.loss_ce: 0.2977 decode.acc_seg: 89.9799 aux.loss_ce: 0.1550 aux.acc_seg: 86.5631 2024/10/26 06:46:31 - mmengine - INFO - Iter(train) [61400/80000] base_lr: 5.3421e-05 lr: 5.3421e-05 eta: 10:06:41 time: 1.9559 data_time: 0.0195 memory: 6646 grad_norm: 2.9969 loss: 0.4756 decode.loss_ce: 0.3199 decode.acc_seg: 90.0504 aux.loss_ce: 0.1557 aux.acc_seg: 85.1849 2024/10/26 06:48:09 - mmengine - INFO - Iter(train) [61450/80000] base_lr: 5.3186e-05 lr: 5.3186e-05 eta: 10:05:03 time: 1.9541 data_time: 0.0189 memory: 6646 grad_norm: 4.5940 loss: 0.5612 decode.loss_ce: 0.3812 
decode.acc_seg: 83.8428 aux.loss_ce: 0.1799 aux.acc_seg: 80.8762 2024/10/26 06:49:46 - mmengine - INFO - Iter(train) [61500/80000] base_lr: 5.2952e-05 lr: 5.2952e-05 eta: 10:03:25 time: 1.9496 data_time: 0.0194 memory: 6646 grad_norm: 2.7805 loss: 0.5242 decode.loss_ce: 0.3490 decode.acc_seg: 90.3676 aux.loss_ce: 0.1752 aux.acc_seg: 89.8945 2024/10/26 06:51:24 - mmengine - INFO - Iter(train) [61550/80000] base_lr: 5.2719e-05 lr: 5.2719e-05 eta: 10:01:47 time: 1.9549 data_time: 0.0190 memory: 6646 grad_norm: 4.7114 loss: 0.4705 decode.loss_ce: 0.3156 decode.acc_seg: 87.4790 aux.loss_ce: 0.1549 aux.acc_seg: 86.5569 2024/10/26 06:53:02 - mmengine - INFO - Iter(train) [61600/80000] base_lr: 5.2485e-05 lr: 5.2485e-05 eta: 10:00:09 time: 1.9480 data_time: 0.0193 memory: 6646 grad_norm: 3.9899 loss: 0.4020 decode.loss_ce: 0.2603 decode.acc_seg: 92.9669 aux.loss_ce: 0.1417 aux.acc_seg: 91.8344 2024/10/26 06:54:44 - mmengine - INFO - Iter(train) [61650/80000] base_lr: 5.2251e-05 lr: 5.2251e-05 eta: 9:58:33 time: 1.9626 data_time: 0.0187 memory: 6645 grad_norm: 3.7771 loss: 0.5041 decode.loss_ce: 0.3267 decode.acc_seg: 84.9284 aux.loss_ce: 0.1774 aux.acc_seg: 84.1359 2024/10/26 06:56:22 - mmengine - INFO - Iter(train) [61700/80000] base_lr: 5.2017e-05 lr: 5.2017e-05 eta: 9:56:55 time: 1.9500 data_time: 0.0195 memory: 6645 grad_norm: 2.9804 loss: 0.5231 decode.loss_ce: 0.3429 decode.acc_seg: 84.9898 aux.loss_ce: 0.1802 aux.acc_seg: 80.1265 2024/10/26 06:58:01 - mmengine - INFO - Iter(train) [61750/80000] base_lr: 5.1784e-05 lr: 5.1784e-05 eta: 9:55:17 time: 1.9777 data_time: 0.0183 memory: 6645 grad_norm: 5.8347 loss: 0.5237 decode.loss_ce: 0.3549 decode.acc_seg: 87.8808 aux.loss_ce: 0.1689 aux.acc_seg: 84.4577 2024/10/26 06:59:38 - mmengine - INFO - Iter(train) [61800/80000] base_lr: 5.1551e-05 lr: 5.1551e-05 eta: 9:53:39 time: 1.9541 data_time: 0.0198 memory: 6645 grad_norm: 4.6666 loss: 0.4412 decode.loss_ce: 0.2930 decode.acc_seg: 89.5201 aux.loss_ce: 0.1482 aux.acc_seg: 
87.6898 2024/10/26 07:01:16 - mmengine - INFO - Iter(train) [61850/80000] base_lr: 5.1317e-05 lr: 5.1317e-05 eta: 9:52:01 time: 1.9648 data_time: 0.0197 memory: 6646 grad_norm: 4.4479 loss: 0.4695 decode.loss_ce: 0.3177 decode.acc_seg: 85.9613 aux.loss_ce: 0.1518 aux.acc_seg: 85.2587 2024/10/26 07:02:54 - mmengine - INFO - Iter(train) [61900/80000] base_lr: 5.1084e-05 lr: 5.1084e-05 eta: 9:50:23 time: 1.9561 data_time: 0.0195 memory: 6646 grad_norm: 4.8597 loss: 0.4515 decode.loss_ce: 0.3049 decode.acc_seg: 89.3299 aux.loss_ce: 0.1467 aux.acc_seg: 86.4460 2024/10/26 07:04:32 - mmengine - INFO - Iter(train) [61950/80000] base_lr: 5.0851e-05 lr: 5.0851e-05 eta: 9:48:46 time: 1.9464 data_time: 0.0191 memory: 6645 grad_norm: 3.8665 loss: 0.5362 decode.loss_ce: 0.3473 decode.acc_seg: 83.6244 aux.loss_ce: 0.1889 aux.acc_seg: 72.8204 2024/10/26 07:06:10 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 07:06:10 - mmengine - INFO - Iter(train) [62000/80000] base_lr: 5.0619e-05 lr: 5.0619e-05 eta: 9:47:08 time: 1.9598 data_time: 0.0182 memory: 6645 grad_norm: 2.6818 loss: 0.3936 decode.loss_ce: 0.2666 decode.acc_seg: 90.7588 aux.loss_ce: 0.1270 aux.acc_seg: 86.8699 2024/10/26 07:07:48 - mmengine - INFO - Iter(train) [62050/80000] base_lr: 5.0386e-05 lr: 5.0386e-05 eta: 9:45:30 time: 1.9528 data_time: 0.0190 memory: 6645 grad_norm: 4.4685 loss: 0.4190 decode.loss_ce: 0.2773 decode.acc_seg: 87.4179 aux.loss_ce: 0.1417 aux.acc_seg: 89.1478 2024/10/26 07:09:26 - mmengine - INFO - Iter(train) [62100/80000] base_lr: 5.0153e-05 lr: 5.0153e-05 eta: 9:43:52 time: 1.9543 data_time: 0.0188 memory: 6645 grad_norm: 3.9040 loss: 0.4495 decode.loss_ce: 0.2954 decode.acc_seg: 85.8747 aux.loss_ce: 0.1541 aux.acc_seg: 85.5181 2024/10/26 07:11:03 - mmengine - INFO - Iter(train) [62150/80000] base_lr: 4.9921e-05 lr: 4.9921e-05 eta: 9:42:14 time: 1.9424 data_time: 0.0192 memory: 6645 grad_norm: 3.1696 loss: 0.5352 decode.loss_ce: 0.3494 
decode.acc_seg: 84.9857 aux.loss_ce: 0.1859 aux.acc_seg: 78.1954 2024/10/26 07:12:43 - mmengine - INFO - Iter(train) [62200/80000] base_lr: 4.9689e-05 lr: 4.9689e-05 eta: 9:40:37 time: 1.9565 data_time: 0.0199 memory: 6645 grad_norm: 5.5647 loss: 0.5934 decode.loss_ce: 0.3945 decode.acc_seg: 85.4412 aux.loss_ce: 0.1989 aux.acc_seg: 85.0500 2024/10/26 07:14:21 - mmengine - INFO - Iter(train) [62250/80000] base_lr: 4.9457e-05 lr: 4.9457e-05 eta: 9:38:59 time: 1.9556 data_time: 0.0197 memory: 6645 grad_norm: 3.8477 loss: 0.4668 decode.loss_ce: 0.3036 decode.acc_seg: 91.4155 aux.loss_ce: 0.1632 aux.acc_seg: 88.3894 2024/10/26 07:15:59 - mmengine - INFO - Iter(train) [62300/80000] base_lr: 4.9225e-05 lr: 4.9225e-05 eta: 9:37:21 time: 1.9568 data_time: 0.0191 memory: 6645 grad_norm: 3.9007 loss: 0.4968 decode.loss_ce: 0.3323 decode.acc_seg: 85.7599 aux.loss_ce: 0.1645 aux.acc_seg: 86.0375 2024/10/26 07:17:37 - mmengine - INFO - Iter(train) [62350/80000] base_lr: 4.8993e-05 lr: 4.8993e-05 eta: 9:35:43 time: 1.9587 data_time: 0.0187 memory: 6645 grad_norm: 2.8618 loss: 0.4685 decode.loss_ce: 0.3041 decode.acc_seg: 85.7405 aux.loss_ce: 0.1645 aux.acc_seg: 80.5175 2024/10/26 07:19:15 - mmengine - INFO - Iter(train) [62400/80000] base_lr: 4.8762e-05 lr: 4.8762e-05 eta: 9:34:05 time: 1.9521 data_time: 0.0187 memory: 6645 grad_norm: 4.1101 loss: 0.5130 decode.loss_ce: 0.3405 decode.acc_seg: 85.2527 aux.loss_ce: 0.1726 aux.acc_seg: 83.3648 2024/10/26 07:20:52 - mmengine - INFO - Iter(train) [62450/80000] base_lr: 4.8530e-05 lr: 4.8530e-05 eta: 9:32:27 time: 1.9484 data_time: 0.0186 memory: 6646 grad_norm: 3.9969 loss: 0.5266 decode.loss_ce: 0.3512 decode.acc_seg: 81.6902 aux.loss_ce: 0.1754 aux.acc_seg: 77.5671 2024/10/26 07:22:30 - mmengine - INFO - Iter(train) [62500/80000] base_lr: 4.8299e-05 lr: 4.8299e-05 eta: 9:30:49 time: 1.9556 data_time: 0.0189 memory: 6647 grad_norm: 3.3927 loss: 0.4973 decode.loss_ce: 0.3321 decode.acc_seg: 87.7524 aux.loss_ce: 0.1652 aux.acc_seg: 
87.0403 2024/10/26 07:24:08 - mmengine - INFO - Iter(train) [62550/80000] base_lr: 4.8068e-05 lr: 4.8068e-05 eta: 9:29:12 time: 1.9488 data_time: 0.0189 memory: 6646 grad_norm: 5.0717 loss: 0.5277 decode.loss_ce: 0.3611 decode.acc_seg: 87.6190 aux.loss_ce: 0.1666 aux.acc_seg: 85.3810 2024/10/26 07:25:45 - mmengine - INFO - Iter(train) [62600/80000] base_lr: 4.7837e-05 lr: 4.7837e-05 eta: 9:27:34 time: 1.9686 data_time: 0.0197 memory: 6645 grad_norm: 4.8155 loss: 0.4633 decode.loss_ce: 0.3018 decode.acc_seg: 82.1242 aux.loss_ce: 0.1615 aux.acc_seg: 81.7300 2024/10/26 07:27:23 - mmengine - INFO - Iter(train) [62650/80000] base_lr: 4.7607e-05 lr: 4.7607e-05 eta: 9:25:56 time: 1.9564 data_time: 0.0191 memory: 6646 grad_norm: 3.3882 loss: 0.5136 decode.loss_ce: 0.3409 decode.acc_seg: 89.2123 aux.loss_ce: 0.1727 aux.acc_seg: 90.7284 2024/10/26 07:29:01 - mmengine - INFO - Iter(train) [62700/80000] base_lr: 4.7376e-05 lr: 4.7376e-05 eta: 9:24:18 time: 1.9488 data_time: 0.0192 memory: 6646 grad_norm: 4.0596 loss: 0.4506 decode.loss_ce: 0.3034 decode.acc_seg: 83.6343 aux.loss_ce: 0.1472 aux.acc_seg: 84.3368 2024/10/26 07:30:39 - mmengine - INFO - Iter(train) [62750/80000] base_lr: 4.7146e-05 lr: 4.7146e-05 eta: 9:22:40 time: 1.9469 data_time: 0.0187 memory: 6645 grad_norm: 4.1655 loss: 0.4583 decode.loss_ce: 0.3023 decode.acc_seg: 81.2768 aux.loss_ce: 0.1561 aux.acc_seg: 81.3074 2024/10/26 07:32:16 - mmengine - INFO - Iter(train) [62800/80000] base_lr: 4.6916e-05 lr: 4.6916e-05 eta: 9:21:02 time: 1.9580 data_time: 0.0179 memory: 6646 grad_norm: 4.0046 loss: 0.5546 decode.loss_ce: 0.3655 decode.acc_seg: 86.1078 aux.loss_ce: 0.1890 aux.acc_seg: 85.3227 2024/10/26 07:33:54 - mmengine - INFO - Iter(train) [62850/80000] base_lr: 4.6686e-05 lr: 4.6686e-05 eta: 9:19:24 time: 1.9638 data_time: 0.0177 memory: 6645 grad_norm: 6.9417 loss: 0.5295 decode.loss_ce: 0.3473 decode.acc_seg: 85.5411 aux.loss_ce: 0.1822 aux.acc_seg: 81.3864 2024/10/26 07:35:32 - mmengine - INFO - Iter(train) 
[62900/80000] base_lr: 4.6457e-05 lr: 4.6457e-05 eta: 9:17:46 time: 1.9482 data_time: 0.0188 memory: 6645 grad_norm: 3.9786 loss: 0.4792 decode.loss_ce: 0.3120 decode.acc_seg: 90.6731 aux.loss_ce: 0.1672 aux.acc_seg: 90.5842 2024/10/26 07:37:10 - mmengine - INFO - Iter(train) [62950/80000] base_lr: 4.6227e-05 lr: 4.6227e-05 eta: 9:16:08 time: 1.9557 data_time: 0.0188 memory: 6646 grad_norm: 4.6540 loss: 0.5237 decode.loss_ce: 0.3304 decode.acc_seg: 86.7039 aux.loss_ce: 0.1933 aux.acc_seg: 83.5822 2024/10/26 07:38:47 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 07:38:47 - mmengine - INFO - Iter(train) [63000/80000] base_lr: 4.5998e-05 lr: 4.5998e-05 eta: 9:14:31 time: 1.9497 data_time: 0.0187 memory: 6646 grad_norm: 4.6142 loss: 0.4841 decode.loss_ce: 0.3221 decode.acc_seg: 91.4910 aux.loss_ce: 0.1620 aux.acc_seg: 89.6019 2024/10/26 07:40:25 - mmengine - INFO - Iter(train) [63050/80000] base_lr: 4.5769e-05 lr: 4.5769e-05 eta: 9:12:53 time: 1.9564 data_time: 0.0178 memory: 6647 grad_norm: 4.1327 loss: 0.5122 decode.loss_ce: 0.3417 decode.acc_seg: 84.6658 aux.loss_ce: 0.1705 aux.acc_seg: 79.4011 2024/10/26 07:42:03 - mmengine - INFO - Iter(train) [63100/80000] base_lr: 4.5540e-05 lr: 4.5540e-05 eta: 9:11:15 time: 1.9622 data_time: 0.0188 memory: 6645 grad_norm: 6.4234 loss: 0.5284 decode.loss_ce: 0.3396 decode.acc_seg: 82.8932 aux.loss_ce: 0.1888 aux.acc_seg: 75.8188 2024/10/26 07:43:44 - mmengine - INFO - Iter(train) [63150/80000] base_lr: 4.5312e-05 lr: 4.5312e-05 eta: 9:09:38 time: 1.9711 data_time: 0.0187 memory: 6646 grad_norm: 3.3090 loss: 0.4364 decode.loss_ce: 0.2853 decode.acc_seg: 88.6410 aux.loss_ce: 0.1511 aux.acc_seg: 88.9988 2024/10/26 07:45:22 - mmengine - INFO - Iter(train) [63200/80000] base_lr: 4.5083e-05 lr: 4.5083e-05 eta: 9:08:00 time: 1.9501 data_time: 0.0176 memory: 6645 grad_norm: 5.7632 loss: 0.5548 decode.loss_ce: 0.3710 decode.acc_seg: 75.6794 aux.loss_ce: 0.1838 aux.acc_seg: 71.7313 
2024/10/26 07:46:59 - mmengine - INFO - Iter(train) [63250/80000] base_lr: 4.4855e-05 lr: 4.4855e-05 eta: 9:06:22 time: 1.9525 data_time: 0.0185 memory: 6645 grad_norm: 3.8860 loss: 0.4158 decode.loss_ce: 0.2674 decode.acc_seg: 89.9525 aux.loss_ce: 0.1484 aux.acc_seg: 79.0737 2024/10/26 07:48:37 - mmengine - INFO - Iter(train) [63300/80000] base_lr: 4.4627e-05 lr: 4.4627e-05 eta: 9:04:44 time: 1.9542 data_time: 0.0187 memory: 6647 grad_norm: 3.8189 loss: 0.5221 decode.loss_ce: 0.3501 decode.acc_seg: 83.4592 aux.loss_ce: 0.1720 aux.acc_seg: 75.4672 2024/10/26 07:50:15 - mmengine - INFO - Iter(train) [63350/80000] base_lr: 4.4400e-05 lr: 4.4400e-05 eta: 9:03:06 time: 1.9568 data_time: 0.0188 memory: 6646 grad_norm: 3.8059 loss: 0.4530 decode.loss_ce: 0.3023 decode.acc_seg: 90.0051 aux.loss_ce: 0.1508 aux.acc_seg: 87.7335 2024/10/26 07:51:52 - mmengine - INFO - Iter(train) [63400/80000] base_lr: 4.4172e-05 lr: 4.4172e-05 eta: 9:01:28 time: 1.9528 data_time: 0.0186 memory: 6645 grad_norm: 4.8071 loss: 0.4606 decode.loss_ce: 0.3042 decode.acc_seg: 90.8398 aux.loss_ce: 0.1563 aux.acc_seg: 88.4263 2024/10/26 07:53:30 - mmengine - INFO - Iter(train) [63450/80000] base_lr: 4.3945e-05 lr: 4.3945e-05 eta: 8:59:50 time: 1.9444 data_time: 0.0189 memory: 6645 grad_norm: 6.4520 loss: 0.5446 decode.loss_ce: 0.3584 decode.acc_seg: 72.3308 aux.loss_ce: 0.1862 aux.acc_seg: 69.2300 2024/10/26 07:55:07 - mmengine - INFO - Iter(train) [63500/80000] base_lr: 4.3718e-05 lr: 4.3718e-05 eta: 8:58:12 time: 1.9479 data_time: 0.0183 memory: 6646 grad_norm: 3.5855 loss: 0.4764 decode.loss_ce: 0.3210 decode.acc_seg: 81.9920 aux.loss_ce: 0.1554 aux.acc_seg: 77.2310 2024/10/26 07:56:46 - mmengine - INFO - Iter(train) [63550/80000] base_lr: 4.3491e-05 lr: 4.3491e-05 eta: 8:56:35 time: 1.9538 data_time: 0.0189 memory: 6647 grad_norm: 4.0276 loss: 0.5045 decode.loss_ce: 0.3381 decode.acc_seg: 88.5538 aux.loss_ce: 0.1663 aux.acc_seg: 86.7267 2024/10/26 07:58:23 - mmengine - INFO - Iter(train) 
[63600/80000] base_lr: 4.3265e-05 lr: 4.3265e-05 eta: 8:54:57 time: 1.9504 data_time: 0.0190 memory: 6646 grad_norm: 4.5793 loss: 0.4823 decode.loss_ce: 0.3153 decode.acc_seg: 90.8210 aux.loss_ce: 0.1670 aux.acc_seg: 86.8066 2024/10/26 08:00:01 - mmengine - INFO - Iter(train) [63650/80000] base_lr: 4.3039e-05 lr: 4.3039e-05 eta: 8:53:19 time: 1.9571 data_time: 0.0184 memory: 6645 grad_norm: 4.1320 loss: 0.4170 decode.loss_ce: 0.2780 decode.acc_seg: 90.8119 aux.loss_ce: 0.1390 aux.acc_seg: 89.4452 2024/10/26 08:01:43 - mmengine - INFO - Iter(train) [63700/80000] base_lr: 4.2813e-05 lr: 4.2813e-05 eta: 8:51:42 time: 1.9505 data_time: 0.0188 memory: 6645 grad_norm: 4.4511 loss: 0.4439 decode.loss_ce: 0.2880 decode.acc_seg: 90.5825 aux.loss_ce: 0.1558 aux.acc_seg: 88.7513 2024/10/26 08:03:20 - mmengine - INFO - Iter(train) [63750/80000] base_lr: 4.2587e-05 lr: 4.2587e-05 eta: 8:50:04 time: 1.9512 data_time: 0.0184 memory: 6645 grad_norm: 4.6564 loss: 0.4371 decode.loss_ce: 0.2963 decode.acc_seg: 91.4998 aux.loss_ce: 0.1408 aux.acc_seg: 89.0841 2024/10/26 08:04:58 - mmengine - INFO - Iter(train) [63800/80000] base_lr: 4.2362e-05 lr: 4.2362e-05 eta: 8:48:26 time: 1.9401 data_time: 0.0179 memory: 6645 grad_norm: 4.1979 loss: 0.5163 decode.loss_ce: 0.3406 decode.acc_seg: 92.0080 aux.loss_ce: 0.1757 aux.acc_seg: 90.1624 2024/10/26 08:06:35 - mmengine - INFO - Iter(train) [63850/80000] base_lr: 4.2137e-05 lr: 4.2137e-05 eta: 8:46:48 time: 1.9654 data_time: 0.0169 memory: 6646 grad_norm: 4.7099 loss: 0.5166 decode.loss_ce: 0.3440 decode.acc_seg: 81.6800 aux.loss_ce: 0.1726 aux.acc_seg: 80.2957 2024/10/26 08:08:13 - mmengine - INFO - Iter(train) [63900/80000] base_lr: 4.1912e-05 lr: 4.1912e-05 eta: 8:45:10 time: 1.9517 data_time: 0.0192 memory: 6647 grad_norm: 5.1752 loss: 0.5119 decode.loss_ce: 0.3407 decode.acc_seg: 86.1632 aux.loss_ce: 0.1711 aux.acc_seg: 85.2540 2024/10/26 08:09:51 - mmengine - INFO - Iter(train) [63950/80000] base_lr: 4.1688e-05 lr: 4.1688e-05 eta: 
8:43:32 time: 1.9495 data_time: 0.0192 memory: 6646 grad_norm: 4.9043 loss: 0.3953 decode.loss_ce: 0.2596 decode.acc_seg: 89.6284 aux.loss_ce: 0.1357 aux.acc_seg: 90.1466
2024/10/26 08:11:29 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 08:11:29 - mmengine - INFO - Iter(train) [64000/80000] base_lr: 4.1463e-05 lr: 4.1463e-05 eta: 8:41:54 time: 1.9517 data_time: 0.0180 memory: 6646 grad_norm: 3.1823 loss: 0.4437 decode.loss_ce: 0.2960 decode.acc_seg: 85.1222 aux.loss_ce: 0.1477 aux.acc_seg: 82.0999
2024/10/26 08:11:29 - mmengine - INFO - Saving checkpoint at 64000 iterations
2024/10/26 08:11:34 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:16 time: 0.0380 data_time: 0.0019 memory: 1049
2024/10/26 08:11:36 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:15 time: 0.0382 data_time: 0.0018 memory: 1117
2024/10/26 08:11:37 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:13 time: 0.0410 data_time: 0.0020 memory: 833
2024/10/26 08:11:39 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:11 time: 0.0371 data_time: 0.0017 memory: 866
2024/10/26 08:11:41 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:09 time: 0.0369 data_time: 0.0019 memory: 906
2024/10/26 08:11:43 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:07 time: 0.0338 data_time: 0.0022 memory: 2028
2024/10/26 08:11:45 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0332 data_time: 0.0018 memory: 832
2024/10/26 08:11:46 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0352 data_time: 0.0018 memory: 904
2024/10/26 08:11:48 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0324 data_time: 0.0015 memory: 839
2024/10/26 08:11:49 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0324 data_time: 0.0014 memory: 889
2024/10/26 08:11:50 - mmengine - INFO - per class results:
2024/10/26 08:11:50 - mmengine - INFO -
+---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 66.75 | 83.94 |
| building | 76.19 | 90.0 |
| sky | 88.73 | 94.51 |
| floor | 70.69 | 85.03 |
| tree | 65.81 | 83.49 |
| ceiling | 76.5 | 85.4 |
| road | 76.26 | 86.45 |
| bed | 79.99 | 88.36 |
| windowpane | 51.9 | 68.28 |
| grass | 59.14 | 78.88 |
| cabinet | 50.33 | 63.95 |
| sidewalk | 55.12 | 70.43 |
| person | 60.22 | 76.87 |
| earth | 32.25 | 46.25 |
| door | 35.67 | 47.9 |
| table | 43.72 | 59.19 |
| mountain | 57.72 | 71.39 |
| plant | 40.58 | 50.52 |
| curtain | 54.27 | 73.08 |
| chair | 38.69 | 54.03 |
| car | 69.22 | 80.62 |
| water | 46.27 | 61.88 |
| painting | 49.71 | 66.95 |
| sofa | 56.26 | 73.33 |
| shelf | 30.7 | 43.2 |
| house | 39.9 | 52.57 |
| sea | 38.59 | 56.32 |
| mirror | 51.46 | 62.21 |
| rug | 48.1 | 63.96 |
| field | 25.92 | 38.05 |
| armchair | 34.9 | 53.64 |
| seat | 51.23 | 72.97 |
| fence | 33.56 | 45.45 |
| desk | 38.06 | 50.91 |
| rock | 33.99 | 56.81 |
| wardrobe | 41.35 | 56.27 |
| lamp | 32.08 | 40.94 |
| bathtub | 68.65 | 76.58 |
| railing | 24.86 | 35.25 |
| cushion | 35.58 | 50.55 |
| base | 22.68 | 30.98 |
| box | 15.59 | 24.36 |
| column | 23.45 | 28.89 |
| signboard | 18.18 | 26.3 |
| chest of drawers | 30.04 | 49.88 |
| counter | 31.3 | 40.66 |
| sand | 30.58 | 51.59 |
| sink | 51.64 | 66.3 |
| skyscraper | 44.5 | 62.13 |
| fireplace | 60.75 | 75.98 |
| refrigerator | 68.78 | 82.02 |
| grandstand | 42.22 | 56.12 |
| path | 20.27 | 31.25 |
| stairs | 19.1 | 25.15 |
| runway | 67.95 | 82.64 |
| case | 41.18 | 61.73 |
| pool table | 76.82 | 83.65 |
| pillow | 42.63 | 54.39 |
| screen door | 55.46 | 70.6 |
| stairway | 22.55 | 34.33 |
| river | 7.74 | 15.25 |
| bridge | 52.27 | 65.12 |
| bookcase | 24.86 | 31.88 |
| blind | 35.42 | 42.38 |
| coffee table | 47.38 | 64.92 |
| toilet | 64.75 | 75.14 |
| flower | 24.34 | 37.77 |
| book | 31.47 | 49.59 |
| hill | 7.07 | 9.77 |
| bench | 31.04 | 41.65 |
| countertop | 47.63 | 63.23 |
| stove | 54.89 | 65.26 |
| palm | 34.03 | 50.02 |
| kitchen island | 24.88 | 42.92 |
| computer | 44.29 | 54.81 |
| swivel chair | 30.89 | 41.39 |
| boat | 36.88 | 52.23 |
| bar | 24.42 | 30.91 |
| arcade machine | 24.56 | 28.67 |
| hovel | 26.41 | 30.35 |
| bus | 77.59 | 85.4 |
| towel | 40.08 | 53.59 |
| light | 9.93 | 10.8 |
| truck | 20.5 | 32.79 |
| tower | 26.32 | 41.24 |
| chandelier | 48.62 | 65.21 |
| awning | 13.49 | 17.62 |
| streetlight | 4.83 | 5.61 |
| booth | 40.38 | 44.63 |
| television receiver | 47.14 | 65.18 |
| airplane | 35.55 | 47.89 |
| dirt track | 10.66 | 52.4 |
| apparel | 31.95 | 47.02 |
| pole | 3.99 | 4.77 |
| land | 0.15 | 0.32 |
| bannister | 1.94 | 2.38 |
| escalator | 45.89 | 76.76 |
| ottoman | 25.18 | 37.58 |
| bottle | 22.4 | 37.42 |
| buffet | 35.64 | 37.68 |
| poster | 18.91 | 25.16 |
| stage | 8.44 | 16.07 |
| van | 38.49 | 49.01 |
| ship | 52.97 | 70.81 |
| fountain | 16.99 | 17.54 |
| conveyer belt | 51.62 | 78.38 |
| canopy | 11.76 | 14.53 |
| washer | 51.75 | 54.43 |
| plaything | 7.29 | 13.93 |
| swimming pool | 75.4 | 81.81 |
| stool | 21.92 | 29.59 |
| barrel | 37.14 | 64.4 |
| basket | 12.39 | 18.6 |
| waterfall | 53.64 | 84.82 |
| tent | 88.72 | 94.85 |
| bag | 5.04 | 6.3 |
| minibike | 47.0 | 58.27 |
| cradle | 70.71 | 87.93 |
| oven | 41.84 | 54.76 |
| ball | 27.79 | 34.92 |
| food | 41.0 | 51.72 |
| step | 9.32 | 11.37 |
| tank | 39.75 | 43.57 |
| trade name | 9.38 | 9.97 |
| microwave | 37.34 | 42.91 |
| pot | 22.96 | 28.52 |
| animal | 43.5 | 49.48 |
| bicycle | 28.25 | 48.21 |
| lake | 16.28 | 30.79 |
| dishwasher | 45.22 | 53.22 |
| screen | 51.87 | 69.3 |
| blanket | 9.72 | 11.09 |
| sculpture | 33.09 | 43.77 |
| hood | 41.74 | 45.88 |
| sconce | 15.08 | 17.52 |
| vase | 15.35 | 21.17 |
| traffic light | 10.91 | 17.17 |
| tray | 0.99 | 3.0 |
| ashcan | 19.26 | 24.6 |
| fan | 28.99 | 38.14 |
| pier | 12.92 | 15.9 |
| crt screen | 0.19 | 0.43 |
| plate | 22.8 | 29.53 |
| monitor | 1.9 | 2.36 |
| bulletin board | 30.26 | 33.35 |
| shower | 2.11 | 4.55 |
| radiator | 33.81 | 38.03 |
| glass | 1.05 | 1.09 |
| clock | 17.95 | 27.02 |
| flag | 17.52 | 19.23 |
+---------------------+-------+-------+
2024/10/26 08:11:50 - mmengine - INFO - Iter(val) [500/500] aAcc: 75.9200 mIoU: 36.2300 mAcc: 47.2300 data_time: 0.0018 time: 0.0353
2024/10/26 08:13:28 - mmengine - INFO - Iter(train) [64050/80000] base_lr: 4.1240e-05 lr: 4.1240e-05 eta: 8:40:17 time: 1.9525 data_time: 0.0187 memory: 6646 grad_norm: 4.6298 loss: 0.4792 decode.loss_ce: 0.3169 decode.acc_seg: 83.3377 aux.loss_ce: 0.1622 aux.acc_seg: 84.5526
2024/10/26 08:15:06 - mmengine - INFO - Iter(train) [64100/80000] base_lr: 4.1016e-05 lr: 4.1016e-05 eta: 8:38:39 time: 1.9494 data_time: 0.0197 memory: 6645 grad_norm: 4.2449 loss: 0.4688 decode.loss_ce: 0.2987 decode.acc_seg: 86.3131 aux.loss_ce: 0.1701 aux.acc_seg: 85.3370
2024/10/26 08:16:43 - mmengine - INFO - Iter(train) [64150/80000] base_lr: 4.0792e-05 lr: 4.0792e-05 eta: 8:37:01 time: 1.9488 data_time: 0.0185 memory: 6645 grad_norm: 3.6154 loss: 0.4981 decode.loss_ce: 0.3271 decode.acc_seg: 86.2541 aux.loss_ce: 0.1709 aux.acc_seg: 83.6936
2024/10/26 08:18:21 - mmengine - INFO - Iter(train) [64200/80000] base_lr: 4.0569e-05 lr: 4.0569e-05 eta: 8:35:23 time: 1.9642 data_time: 0.0178 memory: 6645 grad_norm: 4.9923 loss: 0.4832 decode.loss_ce: 0.3150 decode.acc_seg: 82.3701 aux.loss_ce: 0.1681 aux.acc_seg: 73.2227
2024/10/26 08:19:59 - mmengine - INFO - Iter(train) [64250/80000] base_lr: 4.0347e-05 lr: 4.0347e-05 eta: 8:33:45 time: 1.9575 data_time: 0.0195 memory: 6645 grad_norm: 5.0349 loss: 0.4876 decode.loss_ce: 0.3058 decode.acc_seg: 84.7282 aux.loss_ce: 0.1819 aux.acc_seg: 79.2864
2024/10/26 08:21:37 - mmengine - INFO - Iter(train) [64300/80000] base_lr: 4.0124e-05 lr: 4.0124e-05 eta: 8:32:07 time: 1.9484 data_time: 0.0206 memory: 6645 grad_norm: 5.3281 loss: 0.4790 decode.loss_ce: 0.3217 decode.acc_seg: 91.9365 aux.loss_ce: 0.1573 aux.acc_seg: 87.4157
2024/10/26 08:23:14 - mmengine - INFO - Iter(train) [64350/80000] base_lr: 3.9902e-05 lr: 3.9902e-05 eta:
8:30:29 time: 1.9529 data_time: 0.0204 memory: 6646 grad_norm: 4.5994 loss: 0.4101 decode.loss_ce: 0.2671 decode.acc_seg: 89.8073 aux.loss_ce: 0.1430 aux.acc_seg: 87.7267 2024/10/26 08:24:52 - mmengine - INFO - Iter(train) [64400/80000] base_lr: 3.9680e-05 lr: 3.9680e-05 eta: 8:28:52 time: 1.9522 data_time: 0.0189 memory: 6646 grad_norm: 3.2693 loss: 0.4703 decode.loss_ce: 0.3217 decode.acc_seg: 88.1403 aux.loss_ce: 0.1486 aux.acc_seg: 87.6969 2024/10/26 08:26:30 - mmengine - INFO - Iter(train) [64450/80000] base_lr: 3.9459e-05 lr: 3.9459e-05 eta: 8:27:14 time: 1.9597 data_time: 0.0180 memory: 6645 grad_norm: 4.1344 loss: 0.5479 decode.loss_ce: 0.3517 decode.acc_seg: 82.4936 aux.loss_ce: 0.1963 aux.acc_seg: 83.5984 2024/10/26 08:28:08 - mmengine - INFO - Iter(train) [64500/80000] base_lr: 3.9237e-05 lr: 3.9237e-05 eta: 8:25:36 time: 1.9554 data_time: 0.0196 memory: 6645 grad_norm: 4.2661 loss: 0.5352 decode.loss_ce: 0.3609 decode.acc_seg: 82.7624 aux.loss_ce: 0.1743 aux.acc_seg: 80.2281 2024/10/26 08:29:46 - mmengine - INFO - Iter(train) [64550/80000] base_lr: 3.9016e-05 lr: 3.9016e-05 eta: 8:23:58 time: 1.9456 data_time: 0.0191 memory: 6645 grad_norm: 3.6951 loss: 0.4298 decode.loss_ce: 0.2894 decode.acc_seg: 90.1306 aux.loss_ce: 0.1404 aux.acc_seg: 90.5963 2024/10/26 08:31:24 - mmengine - INFO - Iter(train) [64600/80000] base_lr: 3.8796e-05 lr: 3.8796e-05 eta: 8:22:20 time: 1.9597 data_time: 0.0179 memory: 6645 grad_norm: 3.5231 loss: 0.4713 decode.loss_ce: 0.3163 decode.acc_seg: 92.2095 aux.loss_ce: 0.1550 aux.acc_seg: 91.9512 2024/10/26 08:33:02 - mmengine - INFO - Iter(train) [64650/80000] base_lr: 3.8576e-05 lr: 3.8576e-05 eta: 8:20:42 time: 1.9540 data_time: 0.0178 memory: 6645 grad_norm: 4.2767 loss: 0.4777 decode.loss_ce: 0.3190 decode.acc_seg: 89.0020 aux.loss_ce: 0.1586 aux.acc_seg: 86.8706 2024/10/26 08:34:39 - mmengine - INFO - Iter(train) [64700/80000] base_lr: 3.8356e-05 lr: 3.8356e-05 eta: 8:19:04 time: 1.9489 data_time: 0.0196 memory: 6645 
grad_norm: 3.9564 loss: 0.4493 decode.loss_ce: 0.2981 decode.acc_seg: 85.6450 aux.loss_ce: 0.1512 aux.acc_seg: 89.4645 2024/10/26 08:36:17 - mmengine - INFO - Iter(train) [64750/80000] base_lr: 3.8136e-05 lr: 3.8136e-05 eta: 8:17:26 time: 1.9646 data_time: 0.0174 memory: 6645 grad_norm: 4.8619 loss: 0.4489 decode.loss_ce: 0.2979 decode.acc_seg: 88.0628 aux.loss_ce: 0.1510 aux.acc_seg: 86.7009 2024/10/26 08:37:54 - mmengine - INFO - Iter(train) [64800/80000] base_lr: 3.7917e-05 lr: 3.7917e-05 eta: 8:15:48 time: 1.9547 data_time: 0.0176 memory: 6645 grad_norm: 4.6429 loss: 0.4611 decode.loss_ce: 0.3024 decode.acc_seg: 89.7834 aux.loss_ce: 0.1587 aux.acc_seg: 89.0757 2024/10/26 08:39:32 - mmengine - INFO - Iter(train) [64850/80000] base_lr: 3.7698e-05 lr: 3.7698e-05 eta: 8:14:11 time: 1.9612 data_time: 0.0175 memory: 6645 grad_norm: 4.7157 loss: 0.5365 decode.loss_ce: 0.3647 decode.acc_seg: 76.1643 aux.loss_ce: 0.1718 aux.acc_seg: 77.0224 2024/10/26 08:41:10 - mmengine - INFO - Iter(train) [64900/80000] base_lr: 3.7479e-05 lr: 3.7479e-05 eta: 8:12:33 time: 1.9663 data_time: 0.0192 memory: 6645 grad_norm: 4.2282 loss: 0.5214 decode.loss_ce: 0.3451 decode.acc_seg: 84.5433 aux.loss_ce: 0.1763 aux.acc_seg: 80.1577 2024/10/26 08:42:48 - mmengine - INFO - Iter(train) [64950/80000] base_lr: 3.7261e-05 lr: 3.7261e-05 eta: 8:10:55 time: 1.9498 data_time: 0.0196 memory: 6646 grad_norm: 4.5791 loss: 0.4644 decode.loss_ce: 0.3020 decode.acc_seg: 81.5236 aux.loss_ce: 0.1624 aux.acc_seg: 82.0355 2024/10/26 08:44:26 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 08:44:26 - mmengine - INFO - Iter(train) [65000/80000] base_lr: 3.7043e-05 lr: 3.7043e-05 eta: 8:09:17 time: 1.9746 data_time: 0.0180 memory: 6645 grad_norm: 4.2122 loss: 0.4638 decode.loss_ce: 0.3091 decode.acc_seg: 90.9913 aux.loss_ce: 0.1547 aux.acc_seg: 86.8358 2024/10/26 08:46:04 - mmengine - INFO - Iter(train) [65050/80000] base_lr: 3.6826e-05 lr: 3.6826e-05 eta: 
8:07:39 time: 1.9593 data_time: 0.0189 memory: 6646 grad_norm: 3.7348 loss: 0.4174 decode.loss_ce: 0.2794 decode.acc_seg: 87.4884 aux.loss_ce: 0.1380 aux.acc_seg: 88.2655
2024/10/26 08:47:43 - mmengine - INFO - Iter(train) [65100/80000] base_lr: 3.6609e-05 lr: 3.6609e-05 eta: 8:06:02 time: 1.9538 data_time: 0.0173 memory: 6645 grad_norm: 3.5237 loss: 0.4098 decode.loss_ce: 0.2672 decode.acc_seg: 89.7387 aux.loss_ce: 0.1426 aux.acc_seg: 90.7737
2024/10/26 08:49:21 - mmengine - INFO - Iter(train) [65150/80000] base_lr: 3.6392e-05 lr: 3.6392e-05 eta: 8:04:24 time: 1.9565 data_time: 0.0187 memory: 6646 grad_norm: 2.9497 loss: 0.5119 decode.loss_ce: 0.3304 decode.acc_seg: 85.6319 aux.loss_ce: 0.1815 aux.acc_seg: 83.3125
2024/10/26 08:50:58 - mmengine - INFO - Iter(train) [65200/80000] base_lr: 3.6175e-05 lr: 3.6175e-05 eta: 8:02:46 time: 1.9560 data_time: 0.0202 memory: 6646 grad_norm: 3.8554 loss: 0.5638 decode.loss_ce: 0.3688 decode.acc_seg: 86.6698 aux.loss_ce: 0.1950 aux.acc_seg: 84.6230
2024/10/26 08:52:36 - mmengine - INFO - Iter(train) [65250/80000] base_lr: 3.5959e-05 lr: 3.5959e-05 eta: 8:01:08 time: 1.9498 data_time: 0.0180 memory: 6645 grad_norm: 4.2388 loss: 0.4882 decode.loss_ce: 0.3353 decode.acc_seg: 79.4528 aux.loss_ce: 0.1529 aux.acc_seg: 76.8810
2024/10/26 08:54:14 - mmengine - INFO - Iter(train) [65300/80000] base_lr: 3.5744e-05 lr: 3.5744e-05 eta: 7:59:30 time: 1.9675 data_time: 0.0173 memory: 6646 grad_norm: 2.8250 loss: 0.4672 decode.loss_ce: 0.3114 decode.acc_seg: 86.3142 aux.loss_ce: 0.1558 aux.acc_seg: 85.4297
2024/10/26 08:55:52 - mmengine - INFO - Iter(train) [65350/80000] base_lr: 3.5528e-05 lr: 3.5528e-05 eta: 7:57:52 time: 1.9503 data_time: 0.0192 memory: 6646 grad_norm: 4.6738 loss: 0.4050 decode.loss_ce: 0.2714 decode.acc_seg: 89.7928 aux.loss_ce: 0.1335 aux.acc_seg: 87.2766
2024/10/26 08:57:30 - mmengine - INFO - Iter(train) [65400/80000] base_lr: 3.5313e-05 lr: 3.5313e-05 eta: 7:56:14 time: 1.9571 data_time: 0.0187 memory: 6647 grad_norm: 5.0194 loss: 0.4598 decode.loss_ce: 0.2986 decode.acc_seg: 86.1112 aux.loss_ce: 0.1612 aux.acc_seg: 86.0777
2024/10/26 08:59:07 - mmengine - INFO - Iter(train) [65450/80000] base_lr: 3.5099e-05 lr: 3.5099e-05 eta: 7:54:36 time: 1.9480 data_time: 0.0189 memory: 6645 grad_norm: 3.2921 loss: 0.5715 decode.loss_ce: 0.3823 decode.acc_seg: 81.1025 aux.loss_ce: 0.1892 aux.acc_seg: 77.0143
2024/10/26 09:00:45 - mmengine - INFO - Iter(train) [65500/80000] base_lr: 3.4885e-05 lr: 3.4885e-05 eta: 7:52:58 time: 1.9502 data_time: 0.0191 memory: 6646 grad_norm: 3.1645 loss: 0.4588 decode.loss_ce: 0.3120 decode.acc_seg: 86.5905 aux.loss_ce: 0.1468 aux.acc_seg: 89.0053
2024/10/26 09:02:22 - mmengine - INFO - Iter(train) [65550/80000] base_lr: 3.4671e-05 lr: 3.4671e-05 eta: 7:51:21 time: 1.9540 data_time: 0.0198 memory: 6646 grad_norm: 4.7482 loss: 0.5101 decode.loss_ce: 0.3295 decode.acc_seg: 87.0210 aux.loss_ce: 0.1806 aux.acc_seg: 79.3928
2024/10/26 09:04:00 - mmengine - INFO - Iter(train) [65600/80000] base_lr: 3.4458e-05 lr: 3.4458e-05 eta: 7:49:43 time: 1.9575 data_time: 0.0186 memory: 6645 grad_norm: 5.7874 loss: 0.4637 decode.loss_ce: 0.3146 decode.acc_seg: 87.9149 aux.loss_ce: 0.1491 aux.acc_seg: 87.6669
2024/10/26 09:05:38 - mmengine - INFO - Iter(train) [65650/80000] base_lr: 3.4245e-05 lr: 3.4245e-05 eta: 7:48:05 time: 1.9532 data_time: 0.0211 memory: 6646 grad_norm: 3.9519 loss: 0.5504 decode.loss_ce: 0.3687 decode.acc_seg: 88.0263 aux.loss_ce: 0.1817 aux.acc_seg: 86.8557
2024/10/26 09:07:16 - mmengine - INFO - Iter(train) [65700/80000] base_lr: 3.4032e-05 lr: 3.4032e-05 eta: 7:46:27 time: 1.9523 data_time: 0.0185 memory: 6645 grad_norm: 3.9319 loss: 0.4845 decode.loss_ce: 0.3188 decode.acc_seg: 88.3671 aux.loss_ce: 0.1657 aux.acc_seg: 88.1888
2024/10/26 09:08:53 - mmengine - INFO - Iter(train) [65750/80000] base_lr: 3.3820e-05 lr: 3.3820e-05 eta: 7:44:49 time: 1.9519 data_time: 0.0183 memory: 6646 grad_norm: 3.6513 loss: 0.4631 decode.loss_ce: 0.3106 decode.acc_seg: 89.0927 aux.loss_ce: 0.1525 aux.acc_seg: 86.9702
2024/10/26 09:10:31 - mmengine - INFO - Iter(train) [65800/80000] base_lr: 3.3608e-05 lr: 3.3608e-05 eta: 7:43:11 time: 1.9537 data_time: 0.0180 memory: 6644 grad_norm: 3.8315 loss: 0.4238 decode.loss_ce: 0.2862 decode.acc_seg: 88.6679 aux.loss_ce: 0.1376 aux.acc_seg: 88.1216
2024/10/26 09:12:09 - mmengine - INFO - Iter(train) [65850/80000] base_lr: 3.3396e-05 lr: 3.3396e-05 eta: 7:41:33 time: 1.9736 data_time: 0.0190 memory: 6645 grad_norm: 3.2442 loss: 0.4794 decode.loss_ce: 0.3258 decode.acc_seg: 88.6500 aux.loss_ce: 0.1536 aux.acc_seg: 89.4986
2024/10/26 09:13:47 - mmengine - INFO - Iter(train) [65900/80000] base_lr: 3.3185e-05 lr: 3.3185e-05 eta: 7:39:55 time: 1.9487 data_time: 0.0189 memory: 6648 grad_norm: 6.5703 loss: 0.4502 decode.loss_ce: 0.3030 decode.acc_seg: 88.7030 aux.loss_ce: 0.1472 aux.acc_seg: 87.4114
2024/10/26 09:15:24 - mmengine - INFO - Iter(train) [65950/80000] base_lr: 3.2975e-05 lr: 3.2975e-05 eta: 7:38:17 time: 1.9460 data_time: 0.0181 memory: 6645 grad_norm: 4.1990 loss: 0.4394 decode.loss_ce: 0.2917 decode.acc_seg: 90.1714 aux.loss_ce: 0.1477 aux.acc_seg: 87.4782
2024/10/26 09:17:02 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 09:17:02 - mmengine - INFO - Iter(train) [66000/80000] base_lr: 3.2765e-05 lr: 3.2765e-05 eta: 7:36:40 time: 1.9503 data_time: 0.0194 memory: 6646 grad_norm: 4.1770 loss: 0.4994 decode.loss_ce: 0.3292 decode.acc_seg: 84.7205 aux.loss_ce: 0.1701 aux.acc_seg: 80.7481
2024/10/26 09:18:44 - mmengine - INFO - Iter(train) [66050/80000] base_lr: 3.2555e-05 lr: 3.2555e-05 eta: 7:35:03 time: 1.9503 data_time: 0.0177 memory: 6647 grad_norm: 3.5013 loss: 0.4826 decode.loss_ce: 0.3181 decode.acc_seg: 88.7332 aux.loss_ce: 0.1645 aux.acc_seg: 83.3885
2024/10/26 09:20:22 - mmengine - INFO - Iter(train) [66100/80000] base_lr: 3.2346e-05 lr: 3.2346e-05 eta: 7:33:25 time: 1.9669 data_time: 0.0167 memory: 6646 grad_norm: 5.0607 loss: 0.5241 decode.loss_ce: 0.3325 decode.acc_seg: 86.6988 aux.loss_ce: 0.1917 aux.acc_seg: 86.3842
2024/10/26 09:21:59 - mmengine - INFO - Iter(train) [66150/80000] base_lr: 3.2137e-05 lr: 3.2137e-05 eta: 7:31:47 time: 1.9559 data_time: 0.0199 memory: 6646 grad_norm: 4.5830 loss: 0.5064 decode.loss_ce: 0.3329 decode.acc_seg: 91.4913 aux.loss_ce: 0.1734 aux.acc_seg: 91.7859
2024/10/26 09:23:38 - mmengine - INFO - Iter(train) [66200/80000] base_lr: 3.1928e-05 lr: 3.1928e-05 eta: 7:30:09 time: 1.9464 data_time: 0.0178 memory: 6645 grad_norm: 3.8289 loss: 0.4665 decode.loss_ce: 0.3144 decode.acc_seg: 87.9215 aux.loss_ce: 0.1522 aux.acc_seg: 84.7689
2024/10/26 09:25:15 - mmengine - INFO - Iter(train) [66250/80000] base_lr: 3.1720e-05 lr: 3.1720e-05 eta: 7:28:31 time: 1.9465 data_time: 0.0191 memory: 6646 grad_norm: 4.5029 loss: 0.5249 decode.loss_ce: 0.3533 decode.acc_seg: 87.7950 aux.loss_ce: 0.1716 aux.acc_seg: 86.4899
2024/10/26 09:26:53 - mmengine - INFO - Iter(train) [66300/80000] base_lr: 3.1513e-05 lr: 3.1513e-05 eta: 7:26:53 time: 1.9505 data_time: 0.0185 memory: 6646 grad_norm: 4.2866 loss: 0.4429 decode.loss_ce: 0.2933 decode.acc_seg: 91.4026 aux.loss_ce: 0.1496 aux.acc_seg: 85.1562
2024/10/26 09:28:31 - mmengine - INFO - Iter(train) [66350/80000] base_lr: 3.1306e-05 lr: 3.1306e-05 eta: 7:25:15 time: 1.9562 data_time: 0.0183 memory: 6645 grad_norm: 3.8623 loss: 0.4981 decode.loss_ce: 0.3225 decode.acc_seg: 91.2232 aux.loss_ce: 0.1755 aux.acc_seg: 88.9289
2024/10/26 09:30:08 - mmengine - INFO - Iter(train) [66400/80000] base_lr: 3.1099e-05 lr: 3.1099e-05 eta: 7:23:37 time: 1.9485 data_time: 0.0191 memory: 6646 grad_norm: 5.2496 loss: 0.4075 decode.loss_ce: 0.2724 decode.acc_seg: 93.4535 aux.loss_ce: 0.1350 aux.acc_seg: 89.1002
2024/10/26 09:31:46 - mmengine - INFO - Iter(train) [66450/80000] base_lr: 3.0893e-05 lr: 3.0893e-05 eta: 7:22:00 time: 1.9809 data_time: 0.0175 memory: 6648 grad_norm: 4.1015 loss: 0.5582 decode.loss_ce: 0.3603 decode.acc_seg: 82.5668 aux.loss_ce: 0.1979 aux.acc_seg: 81.3137
2024/10/26 09:33:24 - mmengine - INFO - Iter(train) [66500/80000] base_lr: 3.0687e-05 lr: 3.0687e-05 eta: 7:20:22 time: 1.9452 data_time: 0.0185 memory: 6646 grad_norm: 3.5825 loss: 0.4761 decode.loss_ce: 0.3034 decode.acc_seg: 89.5957 aux.loss_ce: 0.1727 aux.acc_seg: 78.5790
2024/10/26 09:35:02 - mmengine - INFO - Iter(train) [66550/80000] base_lr: 3.0481e-05 lr: 3.0481e-05 eta: 7:18:44 time: 1.9443 data_time: 0.0194 memory: 6646 grad_norm: 4.5718 loss: 0.4544 decode.loss_ce: 0.2971 decode.acc_seg: 83.9232 aux.loss_ce: 0.1573 aux.acc_seg: 77.7674
2024/10/26 09:36:44 - mmengine - INFO - Iter(train) [66600/80000] base_lr: 3.0277e-05 lr: 3.0277e-05 eta: 7:17:07 time: 1.9483 data_time: 0.0187 memory: 6645 grad_norm: 3.8742 loss: 0.5376 decode.loss_ce: 0.3569 decode.acc_seg: 83.9811 aux.loss_ce: 0.1807 aux.acc_seg: 86.4826
2024/10/26 09:38:21 - mmengine - INFO - Iter(train) [66650/80000] base_lr: 3.0072e-05 lr: 3.0072e-05 eta: 7:15:29 time: 1.9607 data_time: 0.0188 memory: 6644 grad_norm: 3.1817 loss: 0.4441 decode.loss_ce: 0.2927 decode.acc_seg: 90.2602 aux.loss_ce: 0.1514 aux.acc_seg: 87.4648
2024/10/26 09:39:59 - mmengine - INFO - Iter(train) [66700/80000] base_lr: 2.9868e-05 lr: 2.9868e-05 eta: 7:13:51 time: 1.9521 data_time: 0.0186 memory: 6645 grad_norm: 3.9061 loss: 0.4821 decode.loss_ce: 0.3204 decode.acc_seg: 88.5513 aux.loss_ce: 0.1617 aux.acc_seg: 88.3667
2024/10/26 09:41:37 - mmengine - INFO - Iter(train) [66750/80000] base_lr: 2.9665e-05 lr: 2.9665e-05 eta: 7:12:13 time: 1.9510 data_time: 0.0182 memory: 6647 grad_norm: 6.0591 loss: 0.4407 decode.loss_ce: 0.3016 decode.acc_seg: 88.0020 aux.loss_ce: 0.1392 aux.acc_seg: 85.5000
2024/10/26 09:43:14 - mmengine - INFO - Iter(train) [66800/80000] base_lr: 2.9462e-05 lr: 2.9462e-05 eta: 7:10:35 time: 1.9506 data_time: 0.0187 memory: 6645 grad_norm: 3.7842 loss: 0.4284 decode.loss_ce: 0.2818 decode.acc_seg: 87.4842 aux.loss_ce: 0.1466 aux.acc_seg: 87.3279
2024/10/26 09:44:52 - mmengine - INFO - Iter(train) [66850/80000] base_lr: 2.9259e-05 lr: 2.9259e-05 eta: 7:08:57 time: 1.9401 data_time: 0.0194 memory: 6646 grad_norm: 3.1308 loss: 0.4000 decode.loss_ce: 0.2736 decode.acc_seg: 86.1284 aux.loss_ce: 0.1264 aux.acc_seg: 83.6328
2024/10/26 09:46:30 - mmengine - INFO - Iter(train) [66900/80000] base_lr: 2.9057e-05 lr: 2.9057e-05 eta: 7:07:19 time: 1.9474 data_time: 0.0193 memory: 6646 grad_norm: 4.0011 loss: 0.5005 decode.loss_ce: 0.3386 decode.acc_seg: 88.1373 aux.loss_ce: 0.1619 aux.acc_seg: 84.8354
2024/10/26 09:48:08 - mmengine - INFO - Iter(train) [66950/80000] base_lr: 2.8855e-05 lr: 2.8855e-05 eta: 7:05:41 time: 1.9476 data_time: 0.0199 memory: 6645 grad_norm: 3.3980 loss: 0.4771 decode.loss_ce: 0.3269 decode.acc_seg: 85.3323 aux.loss_ce: 0.1502 aux.acc_seg: 86.5110
2024/10/26 09:49:45 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 09:49:45 - mmengine - INFO - Iter(train) [67000/80000] base_lr: 2.8654e-05 lr: 2.8654e-05 eta: 7:04:04 time: 1.9487 data_time: 0.0179 memory: 6645 grad_norm: 3.5732 loss: 0.4293 decode.loss_ce: 0.2821 decode.acc_seg: 85.0824 aux.loss_ce: 0.1472 aux.acc_seg: 86.4011
2024/10/26 09:51:23 - mmengine - INFO - Iter(train) [67050/80000] base_lr: 2.8453e-05 lr: 2.8453e-05 eta: 7:02:26 time: 1.9542 data_time: 0.0174 memory: 6645 grad_norm: 7.2969 loss: 0.5502 decode.loss_ce: 0.3648 decode.acc_seg: 82.8873 aux.loss_ce: 0.1854 aux.acc_seg: 76.8429
2024/10/26 09:53:01 - mmengine - INFO - Iter(train) [67100/80000] base_lr: 2.8253e-05 lr: 2.8253e-05 eta: 7:00:48 time: 1.9485 data_time: 0.0187 memory: 6645 grad_norm: 3.8031 loss: 0.3987 decode.loss_ce: 0.2726 decode.acc_seg: 89.7453 aux.loss_ce: 0.1261 aux.acc_seg: 89.8135
2024/10/26 09:54:38 - mmengine - INFO - Iter(train) [67150/80000] base_lr: 2.8054e-05 lr: 2.8054e-05 eta: 6:59:10 time: 1.9555 data_time: 0.0197 memory: 6646 grad_norm: 3.2288 loss: 0.4405 decode.loss_ce: 0.3013 decode.acc_seg: 83.2624 aux.loss_ce: 0.1392 aux.acc_seg: 81.2742
2024/10/26 09:56:16 - mmengine - INFO - Iter(train) [67200/80000] base_lr: 2.7854e-05 lr: 2.7854e-05 eta: 6:57:32 time: 1.9649 data_time: 0.0173 memory: 6645 grad_norm: 5.8030 loss: 0.4253 decode.loss_ce: 0.2760 decode.acc_seg: 88.2085 aux.loss_ce: 0.1493 aux.acc_seg: 83.7622
2024/10/26 09:57:54 - mmengine - INFO - Iter(train) [67250/80000] base_lr: 2.7656e-05 lr: 2.7656e-05 eta: 6:55:54 time: 1.9476 data_time: 0.0193 memory: 6646 grad_norm: 3.4223 loss: 0.3996 decode.loss_ce: 0.2696 decode.acc_seg: 90.6127 aux.loss_ce: 0.1300 aux.acc_seg: 90.1542
2024/10/26 09:59:32 - mmengine - INFO - Iter(train) [67300/80000] base_lr: 2.7457e-05 lr: 2.7457e-05 eta: 6:54:16 time: 1.9524 data_time: 0.0184 memory: 6645 grad_norm: 3.2836 loss: 0.4447 decode.loss_ce: 0.2922 decode.acc_seg: 89.6861 aux.loss_ce: 0.1525 aux.acc_seg: 83.4465
2024/10/26 10:01:09 - mmengine - INFO - Iter(train) [67350/80000] base_lr: 2.7260e-05 lr: 2.7260e-05 eta: 6:52:38 time: 1.9553 data_time: 0.0195 memory: 6645 grad_norm: 5.8508 loss: 0.5787 decode.loss_ce: 0.3781 decode.acc_seg: 86.7461 aux.loss_ce: 0.2006 aux.acc_seg: 77.9595
2024/10/26 10:02:47 - mmengine - INFO - Iter(train) [67400/80000] base_lr: 2.7063e-05 lr: 2.7063e-05 eta: 6:51:00 time: 1.9582 data_time: 0.0178 memory: 6647 grad_norm: 3.6549 loss: 0.4519 decode.loss_ce: 0.2947 decode.acc_seg: 82.6328 aux.loss_ce: 0.1572 aux.acc_seg: 78.1525
2024/10/26 10:04:25 - mmengine - INFO - Iter(train) [67450/80000] base_lr: 2.6866e-05 lr: 2.6866e-05 eta: 6:49:23 time: 1.9517 data_time: 0.0212 memory: 6645 grad_norm: 5.2292 loss: 0.5343 decode.loss_ce: 0.3454 decode.acc_seg: 79.5598 aux.loss_ce: 0.1888 aux.acc_seg: 74.6580
2024/10/26 10:06:02 - mmengine - INFO - Iter(train) [67500/80000] base_lr: 2.6670e-05 lr: 2.6670e-05 eta: 6:47:45 time: 1.9549 data_time: 0.0183 memory: 6645 grad_norm: 3.3458 loss: 0.4641 decode.loss_ce: 0.3085 decode.acc_seg: 87.5056 aux.loss_ce: 0.1555 aux.acc_seg: 87.0719
2024/10/26 10:07:42 - mmengine - INFO - Iter(train) [67550/80000] base_lr: 2.6474e-05 lr: 2.6474e-05 eta: 6:46:07 time: 1.9569 data_time: 0.0188 memory: 6646 grad_norm: 5.3712 loss: 0.4444 decode.loss_ce: 0.2943 decode.acc_seg: 89.3132 aux.loss_ce: 0.1501 aux.acc_seg: 87.4409
2024/10/26 10:09:20 - mmengine - INFO - Iter(train) [67600/80000] base_lr: 2.6279e-05 lr: 2.6279e-05 eta: 6:44:29 time: 1.9590 data_time: 0.0194 memory: 6646 grad_norm: 3.7488 loss: 0.4448 decode.loss_ce: 0.2941 decode.acc_seg: 88.4023 aux.loss_ce: 0.1506 aux.acc_seg: 86.7719
2024/10/26 10:10:58 - mmengine - INFO - Iter(train) [67650/80000] base_lr: 2.6084e-05 lr: 2.6084e-05 eta: 6:42:51 time: 1.9486 data_time: 0.0199 memory: 6645 grad_norm: 4.2654 loss: 0.4278 decode.loss_ce: 0.2745 decode.acc_seg: 88.2943 aux.loss_ce: 0.1533 aux.acc_seg: 86.1698
2024/10/26 10:12:35 - mmengine - INFO - Iter(train) [67700/80000] base_lr: 2.5890e-05 lr: 2.5890e-05 eta: 6:41:13 time: 1.9523 data_time: 0.0190 memory: 6646 grad_norm: 4.9775 loss: 0.5193 decode.loss_ce: 0.3428 decode.acc_seg: 82.2431 aux.loss_ce: 0.1765 aux.acc_seg: 77.8303
2024/10/26 10:14:13 - mmengine - INFO - Iter(train) [67750/80000] base_lr: 2.5697e-05 lr: 2.5697e-05 eta: 6:39:36 time: 1.9490 data_time: 0.0187 memory: 6645 grad_norm: 3.6832 loss: 0.4129 decode.loss_ce: 0.2729 decode.acc_seg: 89.0980 aux.loss_ce: 0.1400 aux.acc_seg: 85.7648
2024/10/26 10:15:50 - mmengine - INFO - Iter(train) [67800/80000] base_lr: 2.5504e-05 lr: 2.5504e-05 eta: 6:37:58 time: 1.9500 data_time: 0.0190 memory: 6646 grad_norm: 3.9533 loss: 0.4619 decode.loss_ce: 0.3051 decode.acc_seg: 89.5958 aux.loss_ce: 0.1567 aux.acc_seg: 89.6223
2024/10/26 10:17:29 - mmengine - INFO - Iter(train) [67850/80000] base_lr: 2.5311e-05 lr: 2.5311e-05 eta: 6:36:20 time: 1.9502 data_time: 0.0192 memory: 6647 grad_norm: 4.5004 loss: 0.4444 decode.loss_ce: 0.3025 decode.acc_seg: 86.4452 aux.loss_ce: 0.1419 aux.acc_seg: 85.8617
2024/10/26 10:19:06 - mmengine - INFO - Iter(train) [67900/80000] base_lr: 2.5119e-05 lr: 2.5119e-05 eta: 6:34:42 time: 1.9457 data_time: 0.0184 memory: 6646 grad_norm: 3.1043 loss: 0.4176 decode.loss_ce: 0.2785 decode.acc_seg: 89.5061 aux.loss_ce: 0.1391 aux.acc_seg: 89.5125
2024/10/26 10:20:44 - mmengine - INFO - Iter(train) [67950/80000] base_lr: 2.4928e-05 lr: 2.4928e-05 eta: 6:33:04 time: 1.9465 data_time: 0.0187 memory: 6646 grad_norm: 4.0571 loss: 0.4889 decode.loss_ce: 0.3142 decode.acc_seg: 86.3398 aux.loss_ce: 0.1747 aux.acc_seg: 83.4702
2024/10/26 10:22:22 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 10:22:22 - mmengine - INFO - Iter(train) [68000/80000] base_lr: 2.4737e-05 lr: 2.4737e-05 eta: 6:31:26 time: 1.9556 data_time: 0.0181 memory: 6645 grad_norm: 4.0196 loss: 0.4648 decode.loss_ce: 0.3119 decode.acc_seg: 87.2865 aux.loss_ce: 0.1529 aux.acc_seg: 85.9634
2024/10/26 10:24:00 - mmengine - INFO - Iter(train) [68050/80000] base_lr: 2.4546e-05 lr: 2.4546e-05 eta: 6:29:48 time: 1.9535 data_time: 0.0184 memory: 6647 grad_norm: 6.7228 loss: 0.5094 decode.loss_ce: 0.3346 decode.acc_seg: 86.7965 aux.loss_ce: 0.1748 aux.acc_seg: 85.1107
2024/10/26 10:25:38 - mmengine - INFO - Iter(train) [68100/80000] base_lr: 2.4357e-05 lr: 2.4357e-05 eta: 6:28:10 time: 1.9548 data_time: 0.0181 memory: 6646 grad_norm: 3.1655 loss: 0.5113 decode.loss_ce: 0.3340 decode.acc_seg: 86.0313 aux.loss_ce: 0.1773 aux.acc_seg: 82.4547
2024/10/26 10:27:15 - mmengine - INFO - Iter(train) [68150/80000] base_lr: 2.4167e-05 lr: 2.4167e-05 eta: 6:26:32 time: 1.9537 data_time: 0.0189 memory: 6646 grad_norm: 3.4785 loss: 0.4227 decode.loss_ce: 0.2683 decode.acc_seg: 93.5790 aux.loss_ce: 0.1543 aux.acc_seg: 92.6837
2024/10/26 10:28:53 - mmengine - INFO - Iter(train) [68200/80000] base_lr: 2.3979e-05 lr: 2.3979e-05 eta: 6:24:55 time: 1.9551 data_time: 0.0194 memory: 6645 grad_norm: 3.6085 loss: 0.4750 decode.loss_ce: 0.3159 decode.acc_seg: 86.9947 aux.loss_ce: 0.1590 aux.acc_seg: 83.9610
2024/10/26 10:30:31 - mmengine - INFO - Iter(train) [68250/80000] base_lr: 2.3790e-05 lr: 2.3790e-05 eta: 6:23:17 time: 1.9514 data_time: 0.0187 memory: 6645 grad_norm: 4.8687 loss: 0.4570 decode.loss_ce: 0.3086 decode.acc_seg: 84.0048 aux.loss_ce: 0.1484 aux.acc_seg: 84.5073
2024/10/26 10:32:09 - mmengine - INFO - Iter(train) [68300/80000] base_lr: 2.3603e-05 lr: 2.3603e-05 eta: 6:21:39 time: 1.9734 data_time: 0.0181 memory: 6646 grad_norm: 4.1280 loss: 0.4211 decode.loss_ce: 0.2809 decode.acc_seg: 87.5632 aux.loss_ce: 0.1402 aux.acc_seg: 85.6647
2024/10/26 10:33:47 - mmengine - INFO - Iter(train) [68350/80000] base_lr: 2.3416e-05 lr: 2.3416e-05 eta: 6:20:01 time: 1.9452 data_time: 0.0199 memory: 6645 grad_norm: 5.8809 loss: 0.4064 decode.loss_ce: 0.2666 decode.acc_seg: 88.4949 aux.loss_ce: 0.1398 aux.acc_seg: 88.9059
2024/10/26 10:35:24 - mmengine - INFO - Iter(train) [68400/80000] base_lr: 2.3229e-05 lr: 2.3229e-05 eta: 6:18:23 time: 1.9482 data_time: 0.0209 memory: 6645 grad_norm: 3.7781 loss: 0.4118 decode.loss_ce: 0.2723 decode.acc_seg: 89.6976 aux.loss_ce: 0.1395 aux.acc_seg: 89.2077
2024/10/26 10:37:02 - mmengine - INFO - Iter(train) [68450/80000] base_lr: 2.3043e-05 lr: 2.3043e-05 eta: 6:16:45 time: 1.9446 data_time: 0.0193 memory: 6645 grad_norm: 3.4197 loss: 0.4813 decode.loss_ce: 0.3223 decode.acc_seg: 91.1297 aux.loss_ce: 0.1590 aux.acc_seg: 89.8673
2024/10/26 10:38:43 - mmengine - INFO - Iter(train) [68500/80000] base_lr: 2.2858e-05 lr: 2.2858e-05 eta: 6:15:08 time: 1.9453 data_time: 0.0190 memory: 6646 grad_norm: 3.9017 loss: 0.4668 decode.loss_ce: 0.3146 decode.acc_seg: 90.4243 aux.loss_ce: 0.1522 aux.acc_seg: 86.4218
2024/10/26 10:40:21 - mmengine - INFO - Iter(train) [68550/80000] base_lr: 2.2673e-05 lr: 2.2673e-05 eta: 6:13:30 time: 1.9499 data_time: 0.0181 memory: 6645 grad_norm: 4.0859 loss: 0.4242 decode.loss_ce: 0.2754 decode.acc_seg: 91.6784 aux.loss_ce: 0.1488 aux.acc_seg: 88.4985
2024/10/26 10:42:00 - mmengine - INFO - Iter(train) [68600/80000] base_lr: 2.2489e-05 lr: 2.2489e-05 eta: 6:11:52 time: 1.9665 data_time: 0.0184 memory: 6646 grad_norm: 3.9605 loss: 0.4966 decode.loss_ce: 0.3324 decode.acc_seg: 88.3059 aux.loss_ce: 0.1642 aux.acc_seg: 85.9021
2024/10/26 10:43:37 - mmengine - INFO - Iter(train) [68650/80000] base_lr: 2.2306e-05 lr: 2.2306e-05 eta: 6:10:14 time: 1.9550 data_time: 0.0199 memory: 6647 grad_norm: 4.5890 loss: 0.4736 decode.loss_ce: 0.3077 decode.acc_seg: 90.0710 aux.loss_ce: 0.1659 aux.acc_seg: 90.4051
2024/10/26 10:45:14 - mmengine - INFO - Iter(train) [68700/80000] base_lr: 2.2122e-05 lr: 2.2122e-05 eta: 6:08:36 time: 1.9418 data_time: 0.0196 memory: 6645 grad_norm: 4.8887 loss: 0.4881 decode.loss_ce: 0.3332 decode.acc_seg: 77.1834 aux.loss_ce: 0.1550 aux.acc_seg: 74.1149
2024/10/26 10:46:52 - mmengine - INFO - Iter(train) [68750/80000] base_lr: 2.1940e-05 lr: 2.1940e-05 eta: 6:06:59 time: 1.9456 data_time: 0.0180 memory: 6645 grad_norm: 4.1323 loss: 0.4055 decode.loss_ce: 0.2651 decode.acc_seg: 92.4549 aux.loss_ce: 0.1404 aux.acc_seg: 91.2452
2024/10/26 10:48:30 - mmengine - INFO - Iter(train) [68800/80000] base_lr: 2.1758e-05 lr: 2.1758e-05 eta: 6:05:21 time: 1.9561 data_time: 0.0178 memory: 6645 grad_norm: 3.3078 loss: 0.4685 decode.loss_ce: 0.3057 decode.acc_seg: 87.0931 aux.loss_ce: 0.1628 aux.acc_seg: 84.4574
2024/10/26 10:50:08 - mmengine - INFO - Iter(train) [68850/80000] base_lr: 2.1577e-05 lr: 2.1577e-05 eta: 6:03:43 time: 1.9602 data_time: 0.0188 memory: 6645 grad_norm: 5.1097 loss: 0.4666 decode.loss_ce: 0.3090 decode.acc_seg: 85.4434 aux.loss_ce: 0.1576 aux.acc_seg: 84.0127
2024/10/26 10:51:46 - mmengine - INFO - Iter(train) [68900/80000] base_lr: 2.1396e-05 lr: 2.1396e-05 eta: 6:02:05 time: 1.9707 data_time: 0.0188 memory: 6645 grad_norm: 4.9317 loss: 0.4619 decode.loss_ce: 0.3118 decode.acc_seg: 89.5379 aux.loss_ce: 0.1501 aux.acc_seg: 84.1994
2024/10/26 10:53:23 - mmengine - INFO - Iter(train) [68950/80000] base_lr: 2.1216e-05 lr: 2.1216e-05 eta: 6:00:27 time: 1.9552 data_time: 0.0192 memory: 6646 grad_norm: 3.0467 loss: 0.4754 decode.loss_ce: 0.3129 decode.acc_seg: 84.6394 aux.loss_ce: 0.1625 aux.acc_seg: 82.6544
2024/10/26 10:55:01 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 10:55:01 - mmengine - INFO - Iter(train) [69000/80000] base_lr: 2.1037e-05 lr: 2.1037e-05 eta: 5:58:49 time: 1.9478 data_time: 0.0191 memory: 6645 grad_norm: 4.8824 loss: 0.4944 decode.loss_ce: 0.3209 decode.acc_seg: 77.8384 aux.loss_ce: 0.1735 aux.acc_seg: 71.0683
2024/10/26 10:56:39 - mmengine - INFO - Iter(train) [69050/80000] base_lr: 2.0858e-05 lr: 2.0858e-05 eta: 5:57:11 time: 1.9470 data_time: 0.0193 memory: 6646 grad_norm: 3.1464 loss: 0.4405 decode.loss_ce: 0.2968 decode.acc_seg: 90.4667 aux.loss_ce: 0.1437 aux.acc_seg: 87.5230
2024/10/26 10:58:16 - mmengine - INFO - Iter(train) [69100/80000] base_lr: 2.0680e-05 lr: 2.0680e-05 eta: 5:55:33 time: 1.9508 data_time: 0.0191 memory: 6646 grad_norm: 2.9096 loss: 0.5025 decode.loss_ce: 0.3343 decode.acc_seg: 88.2591 aux.loss_ce: 0.1682 aux.acc_seg: 86.8645
2024/10/26 10:59:54 - mmengine - INFO - Iter(train) [69150/80000] base_lr: 2.0502e-05 lr: 2.0502e-05 eta: 5:53:55 time: 1.9424 data_time: 0.0185 memory: 6645 grad_norm: 4.0850 loss: 0.5011 decode.loss_ce: 0.3197 decode.acc_seg: 87.6046 aux.loss_ce: 0.1813 aux.acc_seg: 79.8914
2024/10/26 11:01:32 - mmengine - INFO - Iter(train) [69200/80000] base_lr: 2.0325e-05 lr: 2.0325e-05 eta: 5:52:18 time: 1.9540 data_time: 0.0184 memory: 6645 grad_norm: 6.0538 loss: 0.4549 decode.loss_ce: 0.2933 decode.acc_seg: 89.8948 aux.loss_ce: 0.1616 aux.acc_seg: 87.1465
2024/10/26 11:03:09 - mmengine - INFO - Iter(train) [69250/80000] base_lr: 2.0148e-05 lr: 2.0148e-05 eta: 5:50:40 time: 1.9482 data_time: 0.0194 memory: 6645 grad_norm: 3.9474 loss: 0.4431 decode.loss_ce: 0.2899 decode.acc_seg: 94.4784 aux.loss_ce: 0.1532 aux.acc_seg: 93.8126
2024/10/26 11:04:47 - mmengine - INFO - Iter(train) [69300/80000] base_lr: 1.9973e-05 lr: 1.9973e-05 eta: 5:49:02 time: 1.9489 data_time: 0.0189 memory: 6644 grad_norm: 3.3680 loss: 0.4139 decode.loss_ce: 0.2789 decode.acc_seg: 90.2734 aux.loss_ce: 0.1351 aux.acc_seg: 90.2983
2024/10/26 11:06:25 - mmengine - INFO - Iter(train) [69350/80000] base_lr: 1.9797e-05 lr: 1.9797e-05 eta: 5:47:24 time: 1.9853 data_time: 0.0189 memory: 6646 grad_norm: 4.2511 loss: 0.4462 decode.loss_ce: 0.2965 decode.acc_seg: 85.7773 aux.loss_ce: 0.1497 aux.acc_seg: 85.0841
2024/10/26 11:08:03 - mmengine - INFO - Iter(train) [69400/80000] base_lr: 1.9623e-05 lr: 1.9623e-05 eta: 5:45:46 time: 1.9463 data_time: 0.0187 memory: 6646 grad_norm: 3.5509 loss: 0.4358 decode.loss_ce: 0.2975 decode.acc_seg: 88.0856 aux.loss_ce: 0.1383 aux.acc_seg: 88.2143
2024/10/26 11:09:43 - mmengine - INFO - Iter(train) [69450/80000] base_lr: 1.9449e-05 lr: 1.9449e-05 eta: 5:44:09 time: 1.9496 data_time: 0.0199 memory: 6645 grad_norm: 4.3470 loss: 0.4954 decode.loss_ce: 0.3273 decode.acc_seg: 89.3679 aux.loss_ce: 0.1681 aux.acc_seg: 86.5988
2024/10/26 11:11:21 - mmengine - INFO - Iter(train) [69500/80000] base_lr: 1.9275e-05 lr: 1.9275e-05 eta: 5:42:31 time: 1.9578 data_time: 0.0190 memory: 6646 grad_norm: 4.6301 loss: 0.5100 decode.loss_ce: 0.3475 decode.acc_seg: 89.5188 aux.loss_ce: 0.1625 aux.acc_seg: 87.4119
2024/10/26 11:12:58 - mmengine - INFO - Iter(train) [69550/80000] base_lr: 1.9103e-05 lr: 1.9103e-05 eta: 5:40:53 time: 1.9523 data_time: 0.0198 memory: 6645 grad_norm: 3.3294 loss: 0.4738 decode.loss_ce: 0.3191 decode.acc_seg: 88.2644 aux.loss_ce: 0.1547 aux.acc_seg: 87.8895
2024/10/26 11:14:36 - mmengine - INFO - Iter(train) [69600/80000] base_lr: 1.8931e-05 lr: 1.8931e-05 eta: 5:39:15 time: 1.9616 data_time: 0.0184 memory: 6645 grad_norm: 3.9569 loss: 0.4762 decode.loss_ce: 0.3108 decode.acc_seg: 84.1613 aux.loss_ce: 0.1654 aux.acc_seg: 81.2059
2024/10/26 11:16:14 - mmengine - INFO - Iter(train) [69650/80000] base_lr: 1.8759e-05 lr: 1.8759e-05 eta: 5:37:37 time: 1.9418 data_time: 0.0185 memory: 6646 grad_norm: 3.4969 loss: 0.4364 decode.loss_ce: 0.2837 decode.acc_seg: 90.9841 aux.loss_ce: 0.1526 aux.acc_seg: 89.6265
2024/10/26 11:17:51 - mmengine - INFO - Iter(train) [69700/80000] base_lr: 1.8588e-05 lr: 1.8588e-05 eta: 5:35:59 time: 1.9531 data_time: 0.0184 memory: 6646 grad_norm: 4.9607 loss: 0.4678 decode.loss_ce: 0.3077 decode.acc_seg: 89.8541 aux.loss_ce: 0.1600 aux.acc_seg: 85.8527
2024/10/26 11:19:30 - mmengine - INFO - Iter(train) [69750/80000] base_lr: 1.8418e-05 lr: 1.8418e-05 eta: 5:34:21 time: 1.9516 data_time: 0.0194 memory: 6646 grad_norm: 3.9933 loss: 0.4386 decode.loss_ce: 0.2977 decode.acc_seg: 93.8702 aux.loss_ce: 0.1408 aux.acc_seg: 92.0061
2024/10/26 11:21:07 - mmengine - INFO - Iter(train) [69800/80000] base_lr: 1.8249e-05 lr: 1.8249e-05 eta: 5:32:43 time: 1.9594 data_time: 0.0195 memory: 6646 grad_norm: 3.0683 loss: 0.4015 decode.loss_ce: 0.2723 decode.acc_seg: 89.9513 aux.loss_ce: 0.1292 aux.acc_seg: 86.9588
2024/10/26 11:22:45 - mmengine - INFO - Iter(train) [69850/80000] base_lr: 1.8080e-05 lr: 1.8080e-05 eta: 5:31:05 time: 1.9454 data_time: 0.0185 memory: 6647 grad_norm: 3.9018 loss: 0.4956 decode.loss_ce: 0.3247 decode.acc_seg: 90.2776 aux.loss_ce: 0.1709 aux.acc_seg: 86.6577
2024/10/26 11:24:22 - mmengine - INFO - Iter(train) [69900/80000] base_lr: 1.7911e-05 lr: 1.7911e-05 eta: 5:29:28 time: 1.9569 data_time: 0.0196 memory: 6645 grad_norm: 3.2419 loss: 0.4381 decode.loss_ce: 0.2884 decode.acc_seg: 94.0986 aux.loss_ce: 0.1497 aux.acc_seg: 93.2563
2024/10/26 11:26:01 - mmengine - INFO - Iter(train) [69950/80000] base_lr: 1.7744e-05 lr: 1.7744e-05 eta: 5:27:50 time: 1.9689 data_time: 0.0184 memory: 6646 grad_norm: 5.0497 loss: 0.4300 decode.loss_ce: 0.2806 decode.acc_seg: 80.5573 aux.loss_ce: 0.1494 aux.acc_seg: 78.3463
2024/10/26 11:27:38 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 11:27:38 - mmengine - INFO - Iter(train) [70000/80000] base_lr: 1.7577e-05 lr: 1.7577e-05 eta: 5:26:12 time: 1.9478 data_time: 0.0174 memory: 6646 grad_norm: 3.8541 loss: 0.4037 decode.loss_ce: 0.2679 decode.acc_seg: 87.8454 aux.loss_ce: 0.1358 aux.acc_seg: 87.0295
2024/10/26 11:29:16 - mmengine - INFO - Iter(train) [70050/80000] base_lr: 1.7411e-05 lr: 1.7411e-05 eta: 5:24:34 time: 1.9526 data_time: 0.0186 memory: 6645 grad_norm: 3.5482 loss: 0.5282 decode.loss_ce: 0.3393 decode.acc_seg: 83.7175 aux.loss_ce: 0.1889 aux.acc_seg: 81.3390
2024/10/26 11:30:54 - mmengine - INFO - Iter(train) [70100/80000] base_lr: 1.7245e-05 lr: 1.7245e-05 eta: 5:22:56 time: 1.9497 data_time: 0.0183 memory: 6646 grad_norm: 3.5428 loss: 0.4919 decode.loss_ce: 0.3354 decode.acc_seg: 83.6499 aux.loss_ce: 0.1564 aux.acc_seg: 81.4003
2024/10/26 11:32:32 - mmengine - INFO - Iter(train) [70150/80000] base_lr: 1.7080e-05 lr: 1.7080e-05 eta: 5:21:18 time: 1.9485 data_time: 0.0189 memory: 6645 grad_norm: 3.5656 loss: 0.4537 decode.loss_ce: 0.3107 decode.acc_seg: 87.9230 aux.loss_ce: 0.1431 aux.acc_seg: 87.8288
2024/10/26 11:34:09 - mmengine - INFO - Iter(train) [70200/80000] base_lr: 1.6916e-05 lr: 1.6916e-05 eta: 5:19:40 time: 1.9472 data_time: 0.0184 memory: 6645 grad_norm: 4.1345 loss: 0.4845 decode.loss_ce: 0.3186 decode.acc_seg: 85.2034 aux.loss_ce: 0.1659 aux.acc_seg: 79.6777
2024/10/26 11:35:47 - mmengine - INFO - Iter(train) [70250/80000] base_lr: 1.6752e-05 lr: 1.6752e-05 eta: 5:18:02 time: 1.9495 data_time: 0.0195 memory: 6645 grad_norm: 2.9798 loss: 0.4417 decode.loss_ce: 0.2983 decode.acc_seg: 84.8063 aux.loss_ce: 0.1434 aux.acc_seg: 80.9451
2024/10/26 11:37:25 - mmengine - INFO - Iter(train) [70300/80000] base_lr: 1.6589e-05 lr: 1.6589e-05 eta: 5:16:25 time: 1.9592 data_time: 0.0186 memory: 6645 grad_norm: 4.5004 loss: 0.4333 decode.loss_ce: 0.2872 decode.acc_seg: 81.5594 aux.loss_ce: 0.1460 aux.acc_seg: 78.8114
2024/10/26 11:39:02 - mmengine - INFO - Iter(train) [70350/80000] base_lr: 1.6427e-05 lr: 1.6427e-05 eta: 5:14:47 time: 1.9449 data_time: 0.0196 memory: 6644 grad_norm: 3.7268 loss: 0.4268 decode.loss_ce: 0.2791 decode.acc_seg: 87.5092 aux.loss_ce: 0.1477 aux.acc_seg: 86.9018
2024/10/26 11:40:43 - mmengine - INFO - Iter(train) [70400/80000] base_lr: 1.6265e-05 lr: 1.6265e-05 eta: 5:13:09 time: 1.9459 data_time: 0.0192 memory: 6645 grad_norm: 3.3383 loss: 0.4499 decode.loss_ce: 0.2971 decode.acc_seg: 89.0112 aux.loss_ce: 0.1528 aux.acc_seg: 89.1952
2024/10/26 11:42:21 - mmengine - INFO - Iter(train) [70450/80000] base_lr: 1.6104e-05 lr: 1.6104e-05 eta: 5:11:31 time: 1.9399 data_time: 0.0178 memory: 6646 grad_norm: 2.8446 loss: 0.4556 decode.loss_ce: 0.2924 decode.acc_seg: 88.2775 aux.loss_ce: 0.1632 aux.acc_seg: 89.0125
2024/10/26 11:43:58 - mmengine - INFO - Iter(train) [70500/80000] base_lr: 1.5944e-05 lr: 1.5944e-05 eta: 5:09:53 time: 1.9507 data_time: 0.0196 memory: 6646 grad_norm: 3.3741 loss: 0.4623 decode.loss_ce: 0.3034 decode.acc_seg: 87.0969 aux.loss_ce: 0.1588 aux.acc_seg: 81.9362
2024/10/26 11:45:37 - mmengine - INFO - Iter(train) [70550/80000] base_lr: 1.5784e-05 lr: 1.5784e-05 eta: 5:08:16 time: 1.9513 data_time: 0.0198 memory: 6645 grad_norm: 4.7154 loss: 0.4601 decode.loss_ce: 0.3001 decode.acc_seg: 88.0868 aux.loss_ce: 0.1599 aux.acc_seg: 88.6847
2024/10/26 11:47:15 - mmengine - INFO - Iter(train) [70600/80000] base_lr: 1.5625e-05 lr: 1.5625e-05 eta: 5:06:38 time: 1.9557 data_time: 0.0184 memory: 6647 grad_norm: 4.2222 loss: 0.4078 decode.loss_ce: 0.2676 decode.acc_seg: 87.4448 aux.loss_ce: 0.1402 aux.acc_seg: 88.4873
2024/10/26 11:48:52 - mmengine - INFO - Iter(train) [70650/80000] base_lr: 1.5467e-05 lr: 1.5467e-05 eta: 5:05:00 time: 1.9447 data_time: 0.0191 memory: 6646 grad_norm: 2.9342 loss: 0.4718 decode.loss_ce: 0.3162 decode.acc_seg: 84.4240 aux.loss_ce: 0.1556 aux.acc_seg: 86.1971
2024/10/26 11:50:30 - mmengine - INFO - Iter(train) [70700/80000] base_lr: 1.5310e-05 lr: 1.5310e-05 eta: 5:03:22 time: 1.9445 data_time: 0.0190 memory: 6645 grad_norm: 3.3063 loss: 0.3711 decode.loss_ce: 0.2416 decode.acc_seg: 91.4195 aux.loss_ce: 0.1295 aux.acc_seg: 84.2815
2024/10/26 11:52:07 - mmengine - INFO - Iter(train) [70750/80000] base_lr: 1.5153e-05 lr: 1.5153e-05 eta: 5:01:44 time: 1.9497 data_time: 0.0172 memory: 6646 grad_norm: 3.5177 loss: 0.4317 decode.loss_ce: 0.2949 decode.acc_seg: 84.8665 aux.loss_ce: 0.1368 aux.acc_seg: 79.6573
2024/10/26 11:53:44 - mmengine - INFO - Iter(train) [70800/80000] base_lr: 1.4996e-05 lr: 1.4996e-05 eta: 5:00:06 time: 1.9513 data_time: 0.0193 memory: 6645 grad_norm: 3.1483 loss: 0.4784 decode.loss_ce: 0.3142 decode.acc_seg: 89.2967 aux.loss_ce: 0.1642 aux.acc_seg: 87.7418
2024/10/26 11:55:22 - mmengine - INFO - Iter(train) [70850/80000] base_lr: 1.4841e-05 lr: 1.4841e-05 eta: 4:58:28 time: 1.9560 data_time: 0.0177 memory: 6646 grad_norm: 4.3535 loss: 0.4191 decode.loss_ce: 0.2750 decode.acc_seg: 91.9282 aux.loss_ce: 0.1441 aux.acc_seg: 93.2066
2024/10/26 11:57:00 - mmengine - INFO - Iter(train) [70900/80000] base_lr: 1.4686e-05 lr: 1.4686e-05 eta: 4:56:50 time: 1.9503 data_time: 0.0181 memory: 6646 grad_norm: 2.7551 loss: 0.4533 decode.loss_ce: 0.2986 decode.acc_seg: 85.7652 aux.loss_ce: 0.1548 aux.acc_seg: 85.9494
2024/10/26 11:58:38 - mmengine - INFO - Iter(train) [70950/80000] base_lr: 1.4532e-05 lr: 1.4532e-05 eta: 4:55:12 time: 1.9466 data_time: 0.0190 memory: 6645 grad_norm: 3.5381 loss: 0.3811 decode.loss_ce: 0.2515 decode.acc_seg: 90.2763 aux.loss_ce: 0.1296 aux.acc_seg: 89.4672
2024/10/26 12:00:15 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 12:00:15 - mmengine - INFO - Iter(train) [71000/80000] base_lr: 1.4379e-05 lr: 1.4379e-05 eta: 4:53:35 time: 1.9556 data_time: 0.0187 memory: 6645 grad_norm: 3.4380 loss: 0.4568 decode.loss_ce: 0.2987 decode.acc_seg: 87.9192 aux.loss_ce: 0.1581 aux.acc_seg: 85.7974
2024/10/26 12:01:53 - mmengine - INFO - Iter(train) [71050/80000] base_lr: 1.4226e-05 lr: 1.4226e-05 eta: 4:51:57 time: 1.9478 data_time: 0.0181 memory: 6645 grad_norm: 3.1865 loss: 0.4221 decode.loss_ce: 0.2708 decode.acc_seg: 92.7875 aux.loss_ce: 0.1512 aux.acc_seg: 91.6308
2024/10/26 12:03:30 - mmengine - INFO - Iter(train) [71100/80000] base_lr: 1.4074e-05 lr: 1.4074e-05 eta: 4:50:19 time: 1.9457 data_time: 0.0190 memory: 6646 grad_norm: 4.4000 loss: 0.4841 decode.loss_ce: 0.3278 decode.acc_seg: 89.2478 aux.loss_ce: 0.1563 aux.acc_seg: 87.6426
2024/10/26 12:05:08 - mmengine - INFO - Iter(train) [71150/80000] base_lr: 1.3923e-05 lr: 1.3923e-05 eta: 4:48:41 time: 1.9494 data_time: 0.0184 memory: 6647 grad_norm: 4.1476 loss: 0.4905 decode.loss_ce: 0.3272 decode.acc_seg: 87.9237 aux.loss_ce: 0.1633 aux.acc_seg: 86.1779
2024/10/26 12:06:46 - mmengine - INFO - Iter(train) [71200/80000] base_lr: 1.3772e-05 lr: 1.3772e-05 eta: 4:47:03 time: 1.9488 data_time: 0.0190 memory: 6645 grad_norm: 3.1063 loss: 0.4069 decode.loss_ce: 0.2729 decode.acc_seg: 88.0078 aux.loss_ce: 0.1340 aux.acc_seg: 90.1534
2024/10/26 12:08:23 - mmengine - INFO - Iter(train) [71250/80000] base_lr: 1.3622e-05 lr: 1.3622e-05 eta: 4:45:25 time: 1.9513 data_time: 0.0189 memory: 6645 grad_norm: 3.9830 loss: 0.5017 decode.loss_ce: 0.3405 decode.acc_seg: 84.3575 aux.loss_ce: 0.1613 aux.acc_seg: 84.6633
2024/10/26 12:10:01 - mmengine - INFO - Iter(train) [71300/80000] base_lr: 1.3473e-05 lr: 1.3473e-05 eta: 4:43:47 time: 1.9471 data_time: 0.0183 memory: 6647 grad_norm: 5.3360 loss: 0.4784 decode.loss_ce: 0.3282 decode.acc_seg: 82.1652 aux.loss_ce: 0.1502 aux.acc_seg: 82.7909
2024/10/26 12:11:43 - mmengine - INFO - Iter(train) [71350/80000] base_lr: 1.3325e-05 lr: 1.3325e-05 eta: 4:42:10 time: 1.9423 data_time: 0.0173 memory: 6645 grad_norm: 3.5690 loss: 0.4353 decode.loss_ce: 0.2811 decode.acc_seg: 87.7219 aux.loss_ce: 0.1542 aux.acc_seg: 84.6045
2024/10/26 12:13:20 - mmengine - INFO - Iter(train) [71400/80000] base_lr: 1.3177e-05 lr: 1.3177e-05 eta: 4:40:32 time: 1.9442 data_time: 0.0181 memory: 6645 grad_norm: 3.3073 loss: 0.4144 decode.loss_ce: 0.2838 decode.acc_seg: 89.2393 aux.loss_ce: 0.1305 aux.acc_seg: 85.1971
2024/10/26 12:14:58 - mmengine - INFO - Iter(train) [71450/80000] base_lr: 1.3030e-05 lr: 1.3030e-05 eta: 4:38:54 time: 1.9465 data_time: 0.0184 memory: 6647 grad_norm: 3.2379 loss: 0.4571 decode.loss_ce: 0.3083 decode.acc_seg: 89.3789 aux.loss_ce: 0.1488 aux.acc_seg: 85.4498
2024/10/26 12:16:35 - mmengine - INFO - Iter(train) [71500/80000] base_lr: 1.2884e-05 lr: 1.2884e-05 eta: 4:37:16 time: 1.9493 data_time: 0.0204 memory: 6646 grad_norm: 3.6858 loss: 0.4623 decode.loss_ce: 0.3099 decode.acc_seg: 85.4775 aux.loss_ce: 0.1524 aux.acc_seg: 82.7251
2024/10/26 12:18:13 - mmengine - INFO - Iter(train) [71550/80000] base_lr: 1.2738e-05 lr: 1.2738e-05 eta: 4:35:38 time: 1.9670 data_time: 0.0182 memory: 6645 grad_norm: 4.0963 loss: 0.4656 decode.loss_ce: 0.3129 decode.acc_seg: 87.7565 aux.loss_ce: 0.1528 aux.acc_seg: 86.4171
2024/10/26 12:19:51 - mmengine - INFO - Iter(train) [71600/80000] base_lr: 1.2594e-05 lr: 1.2594e-05 eta: 4:34:00 time: 1.9542 data_time: 0.0202 memory: 6646 grad_norm: 2.9747 loss: 0.4090 decode.loss_ce: 0.2721 decode.acc_seg: 88.1212 aux.loss_ce: 0.1369 aux.acc_seg: 88.3422
2024/10/26 12:21:29 - mmengine - INFO - Iter(train) [71650/80000] base_lr: 1.2450e-05 lr: 1.2450e-05 eta: 4:32:23 time: 1.9462 data_time: 0.0199 memory: 6647 grad_norm: 2.6385 loss: 0.3806 decode.loss_ce: 0.2519 decode.acc_seg: 90.9166 aux.loss_ce: 0.1286 aux.acc_seg: 91.0194
2024/10/26 12:23:06 - mmengine - INFO - Iter(train) [71700/80000] base_lr: 1.2306e-05 lr: 1.2306e-05 eta: 4:30:45 time: 1.9502 data_time: 0.0195 memory: 6646 grad_norm: 3.7928 loss: 0.4877 decode.loss_ce: 0.3202 decode.acc_seg: 89.6733 aux.loss_ce: 0.1675 aux.acc_seg: 89.5108
2024/10/26 12:24:44 - mmengine - INFO - Iter(train) [71750/80000] base_lr: 1.2164e-05 lr: 1.2164e-05 eta: 4:29:07 time: 1.9591 data_time: 0.0178 memory: 6645 grad_norm: 4.2544 loss: 0.5296 decode.loss_ce: 0.3487 decode.acc_seg: 90.4583 aux.loss_ce: 0.1810 aux.acc_seg: 88.8135
2024/10/26 12:26:22 - mmengine - INFO - Iter(train) [71800/80000] base_lr: 1.2022e-05 lr: 1.2022e-05 eta: 4:27:29 time: 1.9541 data_time: 0.0198 memory: 6646 grad_norm: 4.0158 loss: 0.5179 decode.loss_ce: 0.3362 decode.acc_seg: 85.3834 aux.loss_ce: 0.1817 aux.acc_seg: 84.0874
2024/10/26 12:28:00 - mmengine - INFO - Iter(train) [71850/80000] base_lr: 1.1881e-05 lr: 1.1881e-05 eta: 4:25:51 time: 1.9561 data_time: 0.0189 memory: 6645 grad_norm: 2.8682 loss: 0.3799 decode.loss_ce: 0.2522 decode.acc_seg: 90.5158 aux.loss_ce: 0.1277 aux.acc_seg: 89.0733
2024/10/26 12:29:38 - mmengine - INFO - Iter(train) [71900/80000] base_lr: 1.1740e-05 lr: 1.1740e-05 eta: 4:24:13 time: 1.9565 data_time: 0.0188 memory: 6646 grad_norm: 2.9344 loss: 0.4128 decode.loss_ce: 0.2682 decode.acc_seg: 84.3468 aux.loss_ce: 0.1446 aux.acc_seg: 84.1320
2024/10/26 12:31:16 - mmengine - INFO - Iter(train) [71950/80000] base_lr: 1.1601e-05 lr: 1.1601e-05 eta: 4:22:35 time: 1.9463 data_time: 0.0189 memory: 6646 grad_norm: 3.1089 loss: 0.3982 decode.loss_ce: 0.2586 decode.acc_seg: 91.9477 aux.loss_ce: 0.1396 aux.acc_seg: 90.1050
2024/10/26 12:32:54 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 12:32:54 - mmengine - INFO - Iter(train) [72000/80000] base_lr: 1.1462e-05 lr: 1.1462e-05 eta: 4:20:57 time: 1.9560 data_time: 0.0192 memory: 6646 grad_norm: 3.4342 loss: 0.4406 decode.loss_ce: 0.2885 decode.acc_seg: 86.0711 aux.loss_ce: 0.1520 aux.acc_seg: 79.2593
2024/10/26 12:32:54 - mmengine - INFO - Saving checkpoint at 72000 iterations
2024/10/26 12:32:58 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0360 data_time: 0.0020 memory: 1049
2024/10/26 12:33:00 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0326 data_time: 0.0015 memory: 1117
2024/10/26 12:33:01 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:11 time: 0.0347 data_time: 0.0018 memory: 833
2024/10/26
12:33:03 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0338 data_time: 0.0018 memory: 866
2024/10/26 12:33:05 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0365 data_time: 0.0020 memory: 906
2024/10/26 12:33:06 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0335 data_time: 0.0021 memory: 2028
2024/10/26 12:33:08 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0336 data_time: 0.0017 memory: 832
2024/10/26 12:33:10 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0339 data_time: 0.0018 memory: 904
2024/10/26 12:33:11 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0328 data_time: 0.0015 memory: 839
2024/10/26 12:33:13 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0324 data_time: 0.0014 memory: 889
2024/10/26 12:33:14 - mmengine - INFO - per class results:
2024/10/26 12:33:15 - mmengine - INFO -
+---------------------+-------+-------+
| Class               | IoU   | Acc   |
+---------------------+-------+-------+
| wall                | 66.96 | 84.18 |
| building            | 76.52 | 89.95 |
| sky                 | 88.69 | 94.55 |
| floor               | 71.21 | 86.73 |
| tree                | 65.43 | 84.28 |
| ceiling             | 76.69 | 87.03 |
| road                | 77.52 | 87.99 |
| bed                 | 78.93 | 88.3  |
| windowpane          | 52.33 | 70.26 |
| grass               | 59.9  | 79.14 |
| cabinet             | 52.35 | 64.51 |
| sidewalk            | 55.55 | 72.11 |
| person              | 59.73 | 77.03 |
| earth               | 31.61 | 45.27 |
| door                | 36.69 | 50.09 |
| table               | 42.42 | 60.95 |
| mountain            | 53.53 | 65.87 |
| plant               | 39.82 | 49.57 |
| curtain             | 56.74 | 71.06 |
| chair               | 38.55 | 53.11 |
| car                 | 69.42 | 81.93 |
| water               | 41.46 | 53.87 |
| painting            | 51.78 | 65.36 |
| sofa                | 56.01 | 74.7  |
| shelf               | 31.88 | 46.51 |
| house               | 44.02 | 52.78 |
| sea                 | 43.87 | 67.5  |
| mirror              | 53.23 | 61.92 |
| rug                 | 47.86 | 58.64 |
| field               | 26.17 | 36.75 |
| armchair            | 32.82 | 48.62 |
| seat                | 55.84 | 74.12 |
| fence               | 34.54 | 47.31 |
| desk                | 38.24 | 50.8  |
| rock                | 35.93 | 57.81 |
| wardrobe            | 45.31 | 57.85 |
| lamp                | 32.45 | 42.25 |
| bathtub             | 68.32 | 76.94 |
| railing             | 24.53 | 35.63 |
| cushion             | 32.06 | 41.37 |
| base                | 23.55 | 34.92 |
| box                 | 13.91 | 20.62 |
| column              | 26.76 | 35.82 |
| signboard           | 18.02 | 26.49 |
| chest of drawers    | 32.54 | 47.49 |
| counter             | 27.83 | 34.86 |
| sand                | 32.28 | 51.12 |
| sink                | 52.1  | 64.97 |
| skyscraper          | 37.45 | 48.44 |
| fireplace           | 59.88 | 74.28 |
| refrigerator        | 68.95 | 80.4  |
| grandstand          | 38.98 | 61.31 |
| path                | 19.75 | 31.26 |
| stairs              | 18.29 | 25.26 |
| runway              | 66.81 | 81.06 |
| case                | 44.53 | 61.24 |
| pool table          | 76.17 | 83.44 |
| pillow              | 43.09 | 54.75 |
| screen door         | 59.66 | 68.22 |
| stairway            | 17.71 | 28.47 |
| river               | 11.21 | 23.5  |
| bridge              | 45.32 | 55.19 |
| bookcase            | 28.3  | 39.36 |
| blind               | 22.62 | 25.2  |
| coffee table        | 47.73 | 64.05 |
| toilet              | 65.61 | 74.84 |
| flower              | 25.47 | 40.15 |
| book                | 32.21 | 49.46 |
| hill                | 12.07 | 19.09 |
| bench               | 30.59 | 37.64 |
| countertop          | 47.71 | 62.6  |
| stove               | 51.39 | 65.6  |
| palm                | 30.51 | 51.42 |
| kitchen island      | 29.11 | 53.74 |
| computer            | 45.7  | 54.64 |
| swivel chair        | 34.12 | 44.92 |
| boat                | 42.39 | 57.24 |
| bar                 | 20.91 | 27.38 |
| arcade machine      | 21.34 | 22.58 |
| hovel               | 30.56 | 35.52 |
| bus                 | 77.56 | 88.72 |
| towel               | 41.08 | 51.05 |
| light               | 10.19 | 11.1  |
| truck               | 14.02 | 20.63 |
| tower               | 29.72 | 54.82 |
| chandelier          | 48.13 | 62.6  |
| awning              | 12.93 | 15.65 |
| streetlight         | 6.39  | 7.52  |
| booth               | 39.08 | 40.31 |
| television receiver | 48.93 | 62.84 |
| airplane            | 39.86 | 48.97 |
| dirt track          | 15.64 | 40.26 |
| apparel             | 31.59 | 45.75 |
| pole                | 3.82  | 4.64  |
| land                | 0.0   | 0.01  |
| bannister           | 1.54  | 1.86  |
| escalator           | 43.86 | 75.72 |
| ottoman             | 26.08 | 39.03 |
| bottle              | 23.09 | 39.6  |
| buffet              | 44.92 | 50.14 |
| poster              | 23.55 | 31.21 |
| stage               | 9.15  | 14.83 |
| van                 | 35.33 | 47.56 |
| ship                | 60.21 | 72.29 |
| fountain            | 18.37 | 18.42 |
| conveyer belt       | 57.97 | 76.73 |
| canopy              | 15.39 | 20.21 |
| washer              | 51.63 | 54.49 |
| plaything           | 8.07  | 14.8  |
| swimming pool       | 76.44 | 79.39 |
| stool               | 24.21 | 31.4  |
| barrel              | 40.75 | 64.54 |
| basket              | 12.8  | 16.61 |
| waterfall           | 48.15 | 59.89 |
| tent                | 89.92 | 95.06 |
| bag                 | 4.82  | 5.72  |
| minibike            | 33.96 | 44.77 |
| cradle              | 67.27 | 84.14 |
| oven                | 39.87 | 57.6  |
| ball                | 32.36 | 42.61 |
| food                | 35.61 | 47.94 |
| step                | 12.5  | 15.38 |
| tank                | 37.82 | 39.31 |
| trade name          | 9.36  | 10.02 |
| microwave           | 34.58 | 38.64 |
| pot                 | 22.96 | 28.39 |
| animal              | 36.07 | 39.04 |
| bicycle             | 25.61 | 47.62 |
| lake                | 27.88 | 52.91 |
| dishwasher          | 47.66 | 54.1  |
| screen              | 53.78 | 68.63 |
| blanket             | 6.2   | 6.84  |
| sculpture           | 27.46 | 38.97 |
| hood                | 42.24 | 48.0  |
| sconce              | 16.01 | 18.48 |
| vase                | 12.02 | 17.4  |
| traffic light       | 9.01  | 12.15 |
| tray                | 1.62  | 3.46  |
| ashcan              | 25.83 | 32.29 |
| fan                 | 26.33 | 33.52 |
| pier                | 21.53 | 32.57 |
| crt screen          | 3.58  | 8.67  |
| plate               | 20.48 | 27.11 |
| monitor             | 1.45  | 1.83  |
| bulletin board      | 33.3  | 39.55 |
| shower              | 0.0   | 0.0   |
| radiator            | 31.52 | 35.75 |
| glass               | 0.77  | 0.81  |
| clock               | 18.43 | 21.11 |
| flag                | 19.07 | 20.33 |
+---------------------+-------+-------+
2024/10/26 12:33:15 - mmengine - INFO - Iter(val) [500/500] aAcc: 76.1200 mIoU: 36.4600 mAcc: 46.9800 data_time: 0.0018 time: 0.0338
2024/10/26 12:34:52 - mmengine - INFO - Iter(train) [72050/80000] base_lr: 1.1324e-05 lr: 1.1324e-05 eta: 4:19:20 time: 1.9437 data_time: 0.0195 memory: 6646 grad_norm: 6.5352 loss: 0.4805 decode.loss_ce: 0.3118 decode.acc_seg: 87.5253 aux.loss_ce: 0.1686 aux.acc_seg: 86.8854
2024/10/26 12:36:30 - mmengine - INFO - Iter(train) [72100/80000] base_lr: 1.1186e-05 lr: 1.1186e-05 eta: 4:17:42 time: 1.9511 data_time: 0.0191 memory: 6646 grad_norm: 3.5906 loss: 0.4917 decode.loss_ce: 0.3219 decode.acc_seg: 85.7943 aux.loss_ce: 0.1698 aux.acc_seg: 85.1772
2024/10/26 12:38:07 - mmengine - INFO - Iter(train) [72150/80000] base_lr: 1.1050e-05 lr: 1.1050e-05 eta: 4:16:04 time: 1.9519 data_time: 0.0189 memory: 6646 grad_norm: 3.7648 loss: 0.4545
decode.loss_ce: 0.3101 decode.acc_seg: 87.7428 aux.loss_ce: 0.1445 aux.acc_seg: 89.0269 2024/10/26 12:39:45 - mmengine - INFO - Iter(train) [72200/80000] base_lr: 1.0914e-05 lr: 1.0914e-05 eta: 4:14:26 time: 1.9499 data_time: 0.0188 memory: 6645 grad_norm: 2.2136 loss: 0.3808 decode.loss_ce: 0.2537 decode.acc_seg: 87.7457 aux.loss_ce: 0.1271 aux.acc_seg: 89.3546 2024/10/26 12:41:22 - mmengine - INFO - Iter(train) [72250/80000] base_lr: 1.0779e-05 lr: 1.0779e-05 eta: 4:12:48 time: 1.9532 data_time: 0.0197 memory: 6644 grad_norm: 3.3111 loss: 0.4340 decode.loss_ce: 0.2847 decode.acc_seg: 92.2811 aux.loss_ce: 0.1493 aux.acc_seg: 90.2996 2024/10/26 12:43:00 - mmengine - INFO - Iter(train) [72300/80000] base_lr: 1.0644e-05 lr: 1.0644e-05 eta: 4:11:10 time: 1.9451 data_time: 0.0191 memory: 6645 grad_norm: 4.3316 loss: 0.4709 decode.loss_ce: 0.3134 decode.acc_seg: 90.6732 aux.loss_ce: 0.1575 aux.acc_seg: 90.5468 2024/10/26 12:44:38 - mmengine - INFO - Iter(train) [72350/80000] base_lr: 1.0511e-05 lr: 1.0511e-05 eta: 4:09:32 time: 1.9547 data_time: 0.0204 memory: 6644 grad_norm: 2.9459 loss: 0.4352 decode.loss_ce: 0.2906 decode.acc_seg: 80.5025 aux.loss_ce: 0.1445 aux.acc_seg: 80.1933 2024/10/26 12:46:16 - mmengine - INFO - Iter(train) [72400/80000] base_lr: 1.0378e-05 lr: 1.0378e-05 eta: 4:07:54 time: 1.9499 data_time: 0.0193 memory: 6646 grad_norm: 3.2237 loss: 0.4245 decode.loss_ce: 0.2823 decode.acc_seg: 91.1619 aux.loss_ce: 0.1422 aux.acc_seg: 85.9781 2024/10/26 12:47:53 - mmengine - INFO - Iter(train) [72450/80000] base_lr: 1.0246e-05 lr: 1.0246e-05 eta: 4:06:17 time: 1.9720 data_time: 0.0188 memory: 6645 grad_norm: 3.6967 loss: 0.4088 decode.loss_ce: 0.2763 decode.acc_seg: 89.1660 aux.loss_ce: 0.1325 aux.acc_seg: 89.0065 2024/10/26 12:49:31 - mmengine - INFO - Iter(train) [72500/80000] base_lr: 1.0114e-05 lr: 1.0114e-05 eta: 4:04:39 time: 1.9553 data_time: 0.0181 memory: 6645 grad_norm: 3.3805 loss: 0.4925 decode.loss_ce: 0.3228 decode.acc_seg: 86.8179 aux.loss_ce: 
0.1697 aux.acc_seg: 88.2430 2024/10/26 12:51:09 - mmengine - INFO - Iter(train) [72550/80000] base_lr: 9.9839e-06 lr: 9.9839e-06 eta: 4:03:01 time: 1.9441 data_time: 0.0178 memory: 6645 grad_norm: 5.2693 loss: 0.4734 decode.loss_ce: 0.3172 decode.acc_seg: 80.0587 aux.loss_ce: 0.1562 aux.acc_seg: 77.9660 2024/10/26 12:52:47 - mmengine - INFO - Iter(train) [72600/80000] base_lr: 9.8541e-06 lr: 9.8541e-06 eta: 4:01:23 time: 1.9431 data_time: 0.0186 memory: 6645 grad_norm: 3.7684 loss: 0.4906 decode.loss_ce: 0.3161 decode.acc_seg: 90.4614 aux.loss_ce: 0.1744 aux.acc_seg: 91.1923 2024/10/26 12:54:24 - mmengine - INFO - Iter(train) [72650/80000] base_lr: 9.7252e-06 lr: 9.7252e-06 eta: 3:59:45 time: 1.9469 data_time: 0.0190 memory: 6646 grad_norm: 2.8073 loss: 0.3973 decode.loss_ce: 0.2631 decode.acc_seg: 82.0350 aux.loss_ce: 0.1342 aux.acc_seg: 83.1499 2024/10/26 12:56:02 - mmengine - INFO - Iter(train) [72700/80000] base_lr: 9.5969e-06 lr: 9.5969e-06 eta: 3:58:07 time: 1.9453 data_time: 0.0189 memory: 6646 grad_norm: 3.3232 loss: 0.4836 decode.loss_ce: 0.3271 decode.acc_seg: 84.1291 aux.loss_ce: 0.1565 aux.acc_seg: 84.2763 2024/10/26 12:57:43 - mmengine - INFO - Iter(train) [72750/80000] base_lr: 9.4695e-06 lr: 9.4695e-06 eta: 3:56:30 time: 1.9491 data_time: 0.0207 memory: 6646 grad_norm: 5.0550 loss: 0.4489 decode.loss_ce: 0.2945 decode.acc_seg: 87.4037 aux.loss_ce: 0.1544 aux.acc_seg: 87.8691 2024/10/26 12:59:21 - mmengine - INFO - Iter(train) [72800/80000] base_lr: 9.3428e-06 lr: 9.3428e-06 eta: 3:54:52 time: 1.9558 data_time: 0.0199 memory: 6645 grad_norm: 4.7000 loss: 0.4481 decode.loss_ce: 0.3010 decode.acc_seg: 86.1624 aux.loss_ce: 0.1471 aux.acc_seg: 85.7162 2024/10/26 13:00:59 - mmengine - INFO - Iter(train) [72850/80000] base_lr: 9.2170e-06 lr: 9.2170e-06 eta: 3:53:14 time: 1.9497 data_time: 0.0193 memory: 6645 grad_norm: 3.7468 loss: 0.4152 decode.loss_ce: 0.2671 decode.acc_seg: 80.6374 aux.loss_ce: 0.1481 aux.acc_seg: 76.7835 2024/10/26 13:02:36 - mmengine - 
INFO - Iter(train) [72900/80000] base_lr: 9.0919e-06 lr: 9.0919e-06 eta: 3:51:36 time: 1.9494 data_time: 0.0180 memory: 6645 grad_norm: 2.5523 loss: 0.4672 decode.loss_ce: 0.3108 decode.acc_seg: 89.8960 aux.loss_ce: 0.1564 aux.acc_seg: 86.8767 2024/10/26 13:04:14 - mmengine - INFO - Iter(train) [72950/80000] base_lr: 8.9676e-06 lr: 8.9676e-06 eta: 3:49:58 time: 1.9480 data_time: 0.0182 memory: 6645 grad_norm: 4.3493 loss: 0.4343 decode.loss_ce: 0.2890 decode.acc_seg: 90.3178 aux.loss_ce: 0.1452 aux.acc_seg: 89.5149 2024/10/26 13:05:51 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 13:05:51 - mmengine - INFO - Iter(train) [73000/80000] base_lr: 8.8441e-06 lr: 8.8441e-06 eta: 3:48:20 time: 1.9498 data_time: 0.0184 memory: 6645 grad_norm: 4.4611 loss: 0.4263 decode.loss_ce: 0.2714 decode.acc_seg: 87.6505 aux.loss_ce: 0.1548 aux.acc_seg: 85.5944 2024/10/26 13:07:29 - mmengine - INFO - Iter(train) [73050/80000] base_lr: 8.7213e-06 lr: 8.7213e-06 eta: 3:46:42 time: 1.9669 data_time: 0.0186 memory: 6645 grad_norm: 3.6458 loss: 0.4442 decode.loss_ce: 0.2937 decode.acc_seg: 86.3860 aux.loss_ce: 0.1505 aux.acc_seg: 76.5863 2024/10/26 13:09:06 - mmengine - INFO - Iter(train) [73100/80000] base_lr: 8.5994e-06 lr: 8.5994e-06 eta: 3:45:04 time: 1.9545 data_time: 0.0195 memory: 6645 grad_norm: 3.9155 loss: 0.4588 decode.loss_ce: 0.3166 decode.acc_seg: 91.1680 aux.loss_ce: 0.1422 aux.acc_seg: 89.7070 2024/10/26 13:10:44 - mmengine - INFO - Iter(train) [73150/80000] base_lr: 8.4782e-06 lr: 8.4782e-06 eta: 3:43:27 time: 1.9519 data_time: 0.0203 memory: 6645 grad_norm: 3.6181 loss: 0.4543 decode.loss_ce: 0.3070 decode.acc_seg: 88.6546 aux.loss_ce: 0.1473 aux.acc_seg: 87.0117 2024/10/26 13:12:22 - mmengine - INFO - Iter(train) [73200/80000] base_lr: 8.3579e-06 lr: 8.3579e-06 eta: 3:41:49 time: 1.9582 data_time: 0.0169 memory: 6645 grad_norm: 3.3935 loss: 0.4407 decode.loss_ce: 0.2864 decode.acc_seg: 80.3172 aux.loss_ce: 0.1544 
aux.acc_seg: 76.3759 2024/10/26 13:14:00 - mmengine - INFO - Iter(train) [73250/80000] base_lr: 8.2383e-06 lr: 8.2383e-06 eta: 3:40:11 time: 1.9550 data_time: 0.0190 memory: 6646 grad_norm: 3.2095 loss: 0.4273 decode.loss_ce: 0.2856 decode.acc_seg: 87.2032 aux.loss_ce: 0.1418 aux.acc_seg: 87.6101 2024/10/26 13:15:38 - mmengine - INFO - Iter(train) [73300/80000] base_lr: 8.1196e-06 lr: 8.1196e-06 eta: 3:38:33 time: 1.9491 data_time: 0.0176 memory: 6645 grad_norm: 4.2920 loss: 0.4603 decode.loss_ce: 0.3045 decode.acc_seg: 91.1434 aux.loss_ce: 0.1559 aux.acc_seg: 91.0395 2024/10/26 13:17:16 - mmengine - INFO - Iter(train) [73350/80000] base_lr: 8.0016e-06 lr: 8.0016e-06 eta: 3:36:55 time: 1.9487 data_time: 0.0187 memory: 6646 grad_norm: 2.6517 loss: 0.4348 decode.loss_ce: 0.2898 decode.acc_seg: 87.1668 aux.loss_ce: 0.1450 aux.acc_seg: 85.7806 2024/10/26 13:18:54 - mmengine - INFO - Iter(train) [73400/80000] base_lr: 7.8844e-06 lr: 7.8844e-06 eta: 3:35:17 time: 1.9543 data_time: 0.0178 memory: 6645 grad_norm: 6.4310 loss: 0.4559 decode.loss_ce: 0.2893 decode.acc_seg: 92.0618 aux.loss_ce: 0.1666 aux.acc_seg: 91.6579 2024/10/26 13:20:31 - mmengine - INFO - Iter(train) [73450/80000] base_lr: 7.7681e-06 lr: 7.7681e-06 eta: 3:33:39 time: 1.9511 data_time: 0.0187 memory: 6646 grad_norm: 3.8888 loss: 0.4431 decode.loss_ce: 0.2919 decode.acc_seg: 81.1021 aux.loss_ce: 0.1511 aux.acc_seg: 75.1061 2024/10/26 13:22:09 - mmengine - INFO - Iter(train) [73500/80000] base_lr: 7.6525e-06 lr: 7.6525e-06 eta: 3:32:02 time: 1.9413 data_time: 0.0187 memory: 6645 grad_norm: 3.8058 loss: 0.4652 decode.loss_ce: 0.3068 decode.acc_seg: 91.3078 aux.loss_ce: 0.1584 aux.acc_seg: 90.1481 2024/10/26 13:23:46 - mmengine - INFO - Iter(train) [73550/80000] base_lr: 7.5378e-06 lr: 7.5378e-06 eta: 3:30:24 time: 1.9468 data_time: 0.0183 memory: 6647 grad_norm: 4.4967 loss: 0.4004 decode.loss_ce: 0.2705 decode.acc_seg: 86.3527 aux.loss_ce: 0.1299 aux.acc_seg: 83.7514 2024/10/26 13:25:24 - mmengine - INFO - 
Iter(train) [73600/80000] base_lr: 7.4239e-06 lr: 7.4239e-06 eta: 3:28:46 time: 1.9508 data_time: 0.0192 memory: 6648 grad_norm: 4.4917 loss: 0.4546 decode.loss_ce: 0.2981 decode.acc_seg: 86.6371 aux.loss_ce: 0.1565 aux.acc_seg: 85.4710 2024/10/26 13:27:02 - mmengine - INFO - Iter(train) [73650/80000] base_lr: 7.3107e-06 lr: 7.3107e-06 eta: 3:27:08 time: 1.9471 data_time: 0.0183 memory: 6645 grad_norm: 3.8002 loss: 0.5328 decode.loss_ce: 0.3524 decode.acc_seg: 82.0308 aux.loss_ce: 0.1804 aux.acc_seg: 80.4748 2024/10/26 13:28:43 - mmengine - INFO - Iter(train) [73700/80000] base_lr: 7.1984e-06 lr: 7.1984e-06 eta: 3:25:30 time: 1.9513 data_time: 0.0190 memory: 6645 grad_norm: 7.1109 loss: 0.5133 decode.loss_ce: 0.3408 decode.acc_seg: 88.5358 aux.loss_ce: 0.1725 aux.acc_seg: 89.3779 2024/10/26 13:30:21 - mmengine - INFO - Iter(train) [73750/80000] base_lr: 7.0869e-06 lr: 7.0869e-06 eta: 3:23:52 time: 1.9488 data_time: 0.0185 memory: 6645 grad_norm: 3.5499 loss: 0.4468 decode.loss_ce: 0.2924 decode.acc_seg: 83.6114 aux.loss_ce: 0.1543 aux.acc_seg: 83.2905 2024/10/26 13:31:59 - mmengine - INFO - Iter(train) [73800/80000] base_lr: 6.9763e-06 lr: 6.9763e-06 eta: 3:22:15 time: 1.9457 data_time: 0.0189 memory: 6645 grad_norm: 3.2551 loss: 0.4619 decode.loss_ce: 0.3115 decode.acc_seg: 79.3074 aux.loss_ce: 0.1504 aux.acc_seg: 77.5147 2024/10/26 13:33:37 - mmengine - INFO - Iter(train) [73850/80000] base_lr: 6.8664e-06 lr: 6.8664e-06 eta: 3:20:37 time: 1.9490 data_time: 0.0177 memory: 6645 grad_norm: 2.9267 loss: 0.4717 decode.loss_ce: 0.3166 decode.acc_seg: 84.7476 aux.loss_ce: 0.1551 aux.acc_seg: 84.6705 2024/10/26 13:35:15 - mmengine - INFO - Iter(train) [73900/80000] base_lr: 6.7574e-06 lr: 6.7574e-06 eta: 3:18:59 time: 1.9565 data_time: 0.0183 memory: 6646 grad_norm: 2.5539 loss: 0.4535 decode.loss_ce: 0.3066 decode.acc_seg: 88.2710 aux.loss_ce: 0.1469 aux.acc_seg: 88.8780 2024/10/26 13:36:53 - mmengine - INFO - Iter(train) [73950/80000] base_lr: 6.6491e-06 lr: 6.6491e-06 
eta: 3:17:21 time: 1.9519 data_time: 0.0179 memory: 6647 grad_norm: 3.9857 loss: 0.4202 decode.loss_ce: 0.2743 decode.acc_seg: 87.8275 aux.loss_ce: 0.1459 aux.acc_seg: 86.3649 2024/10/26 13:38:30 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 13:38:30 - mmengine - INFO - Iter(train) [74000/80000] base_lr: 6.5417e-06 lr: 6.5417e-06 eta: 3:15:43 time: 1.9528 data_time: 0.0186 memory: 6645 grad_norm: 3.3500 loss: 0.5033 decode.loss_ce: 0.3321 decode.acc_seg: 83.8490 aux.loss_ce: 0.1712 aux.acc_seg: 77.2459 2024/10/26 13:40:08 - mmengine - INFO - Iter(train) [74050/80000] base_lr: 6.4352e-06 lr: 6.4352e-06 eta: 3:14:05 time: 1.9705 data_time: 0.0194 memory: 6647 grad_norm: 2.9306 loss: 0.4542 decode.loss_ce: 0.3004 decode.acc_seg: 88.4204 aux.loss_ce: 0.1539 aux.acc_seg: 87.0385 2024/10/26 13:41:45 - mmengine - INFO - Iter(train) [74100/80000] base_lr: 6.3294e-06 lr: 6.3294e-06 eta: 3:12:27 time: 1.9483 data_time: 0.0180 memory: 6646 grad_norm: 3.4883 loss: 0.5167 decode.loss_ce: 0.3409 decode.acc_seg: 91.7460 aux.loss_ce: 0.1758 aux.acc_seg: 89.0409 2024/10/26 13:43:23 - mmengine - INFO - Iter(train) [74150/80000] base_lr: 6.2245e-06 lr: 6.2245e-06 eta: 3:10:49 time: 1.9507 data_time: 0.0188 memory: 6646 grad_norm: 3.8459 loss: 0.4543 decode.loss_ce: 0.2919 decode.acc_seg: 87.9222 aux.loss_ce: 0.1624 aux.acc_seg: 89.1662 2024/10/26 13:45:01 - mmengine - INFO - Iter(train) [74200/80000] base_lr: 6.1204e-06 lr: 6.1204e-06 eta: 3:09:12 time: 1.9567 data_time: 0.0186 memory: 6646 grad_norm: 3.4145 loss: 0.3885 decode.loss_ce: 0.2606 decode.acc_seg: 90.2751 aux.loss_ce: 0.1279 aux.acc_seg: 90.3561 2024/10/26 13:46:39 - mmengine - INFO - Iter(train) [74250/80000] base_lr: 6.0172e-06 lr: 6.0172e-06 eta: 3:07:34 time: 1.9540 data_time: 0.0181 memory: 6646 grad_norm: 3.2466 loss: 0.3876 decode.loss_ce: 0.2515 decode.acc_seg: 92.9713 aux.loss_ce: 0.1361 aux.acc_seg: 91.1798 2024/10/26 13:48:16 - mmengine - INFO - Iter(train) 
[74300/80000] base_lr: 5.9147e-06 lr: 5.9147e-06 eta: 3:05:56 time: 1.9553 data_time: 0.0178 memory: 6648 grad_norm: 3.0208 loss: 0.4272 decode.loss_ce: 0.2889 decode.acc_seg: 85.3625 aux.loss_ce: 0.1383 aux.acc_seg: 84.0698 2024/10/26 13:49:54 - mmengine - INFO - Iter(train) [74350/80000] base_lr: 5.8131e-06 lr: 5.8131e-06 eta: 3:04:18 time: 1.9542 data_time: 0.0183 memory: 6646 grad_norm: 3.7532 loss: 0.4535 decode.loss_ce: 0.2964 decode.acc_seg: 93.5298 aux.loss_ce: 0.1571 aux.acc_seg: 92.9845 2024/10/26 13:51:31 - mmengine - INFO - Iter(train) [74400/80000] base_lr: 5.7124e-06 lr: 5.7124e-06 eta: 3:02:40 time: 1.9523 data_time: 0.0196 memory: 6646 grad_norm: 2.8900 loss: 0.3923 decode.loss_ce: 0.2510 decode.acc_seg: 89.0802 aux.loss_ce: 0.1413 aux.acc_seg: 86.2384 2024/10/26 13:53:09 - mmengine - INFO - Iter(train) [74450/80000] base_lr: 5.6125e-06 lr: 5.6125e-06 eta: 3:01:02 time: 1.9630 data_time: 0.0182 memory: 6645 grad_norm: 3.1529 loss: 0.5227 decode.loss_ce: 0.3393 decode.acc_seg: 85.0019 aux.loss_ce: 0.1835 aux.acc_seg: 84.1324 2024/10/26 13:54:47 - mmengine - INFO - Iter(train) [74500/80000] base_lr: 5.5134e-06 lr: 5.5134e-06 eta: 2:59:24 time: 1.9508 data_time: 0.0179 memory: 6645 grad_norm: 3.3919 loss: 0.4719 decode.loss_ce: 0.3104 decode.acc_seg: 84.8050 aux.loss_ce: 0.1615 aux.acc_seg: 85.3074 2024/10/26 13:56:25 - mmengine - INFO - Iter(train) [74550/80000] base_lr: 5.4151e-06 lr: 5.4151e-06 eta: 2:57:46 time: 1.9508 data_time: 0.0195 memory: 6645 grad_norm: 2.3503 loss: 0.4012 decode.loss_ce: 0.2670 decode.acc_seg: 87.0211 aux.loss_ce: 0.1343 aux.acc_seg: 87.0350 2024/10/26 13:58:03 - mmengine - INFO - Iter(train) [74600/80000] base_lr: 5.3177e-06 lr: 5.3177e-06 eta: 2:56:09 time: 1.9438 data_time: 0.0185 memory: 6645 grad_norm: 4.7178 loss: 0.5302 decode.loss_ce: 0.3448 decode.acc_seg: 84.4667 aux.loss_ce: 0.1854 aux.acc_seg: 80.1046 2024/10/26 13:59:42 - mmengine - INFO - Iter(train) [74650/80000] base_lr: 5.2212e-06 lr: 5.2212e-06 eta: 
2:54:31 time: 1.9527 data_time: 0.0178 memory: 6645 grad_norm: 3.4284 loss: 0.4600 decode.loss_ce: 0.3003 decode.acc_seg: 88.1898 aux.loss_ce: 0.1598 aux.acc_seg: 87.3403 2024/10/26 14:01:21 - mmengine - INFO - Iter(train) [74700/80000] base_lr: 5.1255e-06 lr: 5.1255e-06 eta: 2:52:53 time: 1.9700 data_time: 0.0187 memory: 6648 grad_norm: 3.4113 loss: 0.4606 decode.loss_ce: 0.3186 decode.acc_seg: 85.4988 aux.loss_ce: 0.1419 aux.acc_seg: 85.0325 2024/10/26 14:02:58 - mmengine - INFO - Iter(train) [74750/80000] base_lr: 5.0306e-06 lr: 5.0306e-06 eta: 2:51:15 time: 1.9490 data_time: 0.0168 memory: 6645 grad_norm: 4.1112 loss: 0.4976 decode.loss_ce: 0.3301 decode.acc_seg: 89.9022 aux.loss_ce: 0.1675 aux.acc_seg: 90.3352 2024/10/26 14:04:36 - mmengine - INFO - Iter(train) [74800/80000] base_lr: 4.9366e-06 lr: 4.9366e-06 eta: 2:49:37 time: 1.9460 data_time: 0.0184 memory: 6645 grad_norm: 4.5011 loss: 0.3776 decode.loss_ce: 0.2591 decode.acc_seg: 91.2419 aux.loss_ce: 0.1185 aux.acc_seg: 91.6392 2024/10/26 14:06:14 - mmengine - INFO - Iter(train) [74850/80000] base_lr: 4.8434e-06 lr: 4.8434e-06 eta: 2:47:59 time: 1.9536 data_time: 0.0191 memory: 6645 grad_norm: 3.3437 loss: 0.4232 decode.loss_ce: 0.2843 decode.acc_seg: 89.5530 aux.loss_ce: 0.1389 aux.acc_seg: 89.6582 2024/10/26 14:07:51 - mmengine - INFO - Iter(train) [74900/80000] base_lr: 4.7511e-06 lr: 4.7511e-06 eta: 2:46:22 time: 1.9450 data_time: 0.0190 memory: 6646 grad_norm: 3.6654 loss: 0.5176 decode.loss_ce: 0.3367 decode.acc_seg: 93.6763 aux.loss_ce: 0.1809 aux.acc_seg: 85.1241 2024/10/26 14:09:29 - mmengine - INFO - Iter(train) [74950/80000] base_lr: 4.6596e-06 lr: 4.6596e-06 eta: 2:44:44 time: 1.9458 data_time: 0.0179 memory: 6646 grad_norm: 4.5679 loss: 0.4339 decode.loss_ce: 0.2881 decode.acc_seg: 92.7653 aux.loss_ce: 0.1458 aux.acc_seg: 92.4400 2024/10/26 14:11:07 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 14:11:07 - mmengine - INFO - Iter(train) 
[75000/80000] base_lr: 4.5690e-06 lr: 4.5690e-06 eta: 2:43:06 time: 1.9431 data_time: 0.0178 memory: 6646 grad_norm: 4.6080 loss: 0.4206 decode.loss_ce: 0.2687 decode.acc_seg: 89.2746 aux.loss_ce: 0.1519 aux.acc_seg: 84.7545 2024/10/26 14:12:45 - mmengine - INFO - Iter(train) [75050/80000] base_lr: 4.4793e-06 lr: 4.4793e-06 eta: 2:41:28 time: 1.9579 data_time: 0.0171 memory: 6647 grad_norm: 3.9455 loss: 0.4124 decode.loss_ce: 0.2701 decode.acc_seg: 80.1025 aux.loss_ce: 0.1423 aux.acc_seg: 78.1791 2024/10/26 14:14:22 - mmengine - INFO - Iter(train) [75100/80000] base_lr: 4.3904e-06 lr: 4.3904e-06 eta: 2:39:50 time: 1.9564 data_time: 0.0192 memory: 6646 grad_norm: 4.1183 loss: 0.4147 decode.loss_ce: 0.2679 decode.acc_seg: 92.6227 aux.loss_ce: 0.1469 aux.acc_seg: 84.9178 2024/10/26 14:16:00 - mmengine - INFO - Iter(train) [75150/80000] base_lr: 4.3023e-06 lr: 4.3023e-06 eta: 2:38:12 time: 1.9642 data_time: 0.0182 memory: 6645 grad_norm: 3.3058 loss: 0.4580 decode.loss_ce: 0.3036 decode.acc_seg: 91.2558 aux.loss_ce: 0.1544 aux.acc_seg: 89.8412 2024/10/26 14:17:38 - mmengine - INFO - Iter(train) [75200/80000] base_lr: 4.2151e-06 lr: 4.2151e-06 eta: 2:36:34 time: 1.9447 data_time: 0.0186 memory: 6646 grad_norm: 3.8248 loss: 0.4680 decode.loss_ce: 0.3153 decode.acc_seg: 89.4750 aux.loss_ce: 0.1527 aux.acc_seg: 87.8003 2024/10/26 14:19:15 - mmengine - INFO - Iter(train) [75250/80000] base_lr: 4.1288e-06 lr: 4.1288e-06 eta: 2:34:56 time: 1.9575 data_time: 0.0200 memory: 6646 grad_norm: 2.6698 loss: 0.4210 decode.loss_ce: 0.2808 decode.acc_seg: 91.2925 aux.loss_ce: 0.1402 aux.acc_seg: 87.6551 2024/10/26 14:20:53 - mmengine - INFO - Iter(train) [75300/80000] base_lr: 4.0434e-06 lr: 4.0434e-06 eta: 2:33:19 time: 1.9639 data_time: 0.0181 memory: 6646 grad_norm: 3.4417 loss: 0.3731 decode.loss_ce: 0.2551 decode.acc_seg: 91.5554 aux.loss_ce: 0.1179 aux.acc_seg: 91.9929 2024/10/26 14:22:31 - mmengine - INFO - Iter(train) [75350/80000] base_lr: 3.9588e-06 lr: 3.9588e-06 eta: 
2:31:41 time: 1.9578 data_time: 0.0180 memory: 6646 grad_norm: 4.1836 loss: 0.3639 decode.loss_ce: 0.2397 decode.acc_seg: 91.2475 aux.loss_ce: 0.1242 aux.acc_seg: 90.0961 2024/10/26 14:24:09 - mmengine - INFO - Iter(train) [75400/80000] base_lr: 3.8750e-06 lr: 3.8750e-06 eta: 2:30:03 time: 1.9548 data_time: 0.0197 memory: 6645 grad_norm: 3.2707 loss: 0.4120 decode.loss_ce: 0.2671 decode.acc_seg: 87.8212 aux.loss_ce: 0.1449 aux.acc_seg: 87.4491 2024/10/26 14:25:47 - mmengine - INFO - Iter(train) [75450/80000] base_lr: 3.7922e-06 lr: 3.7922e-06 eta: 2:28:25 time: 1.9491 data_time: 0.0195 memory: 6646 grad_norm: 3.8750 loss: 0.4672 decode.loss_ce: 0.3119 decode.acc_seg: 91.7909 aux.loss_ce: 0.1554 aux.acc_seg: 90.4523 2024/10/26 14:27:24 - mmengine - INFO - Iter(train) [75500/80000] base_lr: 3.7102e-06 lr: 3.7102e-06 eta: 2:26:47 time: 1.9650 data_time: 0.0180 memory: 6645 grad_norm: 3.5729 loss: 0.5145 decode.loss_ce: 0.3461 decode.acc_seg: 81.1921 aux.loss_ce: 0.1684 aux.acc_seg: 82.0118 2024/10/26 14:29:02 - mmengine - INFO - Iter(train) [75550/80000] base_lr: 3.6290e-06 lr: 3.6290e-06 eta: 2:25:09 time: 1.9550 data_time: 0.0170 memory: 6645 grad_norm: 3.7089 loss: 0.4276 decode.loss_ce: 0.2824 decode.acc_seg: 86.7901 aux.loss_ce: 0.1452 aux.acc_seg: 84.0923 2024/10/26 14:30:43 - mmengine - INFO - Iter(train) [75600/80000] base_lr: 3.5488e-06 lr: 3.5488e-06 eta: 2:23:32 time: 1.9512 data_time: 0.0185 memory: 6646 grad_norm: 2.7776 loss: 0.4346 decode.loss_ce: 0.2923 decode.acc_seg: 87.6027 aux.loss_ce: 0.1423 aux.acc_seg: 85.1161 2024/10/26 14:32:21 - mmengine - INFO - Iter(train) [75650/80000] base_lr: 3.4694e-06 lr: 3.4694e-06 eta: 2:21:54 time: 1.9660 data_time: 0.0180 memory: 6646 grad_norm: 3.5696 loss: 0.4410 decode.loss_ce: 0.2961 decode.acc_seg: 83.9894 aux.loss_ce: 0.1449 aux.acc_seg: 82.2894 2024/10/26 14:33:59 - mmengine - INFO - Iter(train) [75700/80000] base_lr: 3.3908e-06 lr: 3.3908e-06 eta: 2:20:16 time: 1.9544 data_time: 0.0186 memory: 6645 
grad_norm: 3.5647 loss: 0.3859 decode.loss_ce: 0.2479 decode.acc_seg: 86.8380 aux.loss_ce: 0.1380 aux.acc_seg: 85.4417 2024/10/26 14:35:37 - mmengine - INFO - Iter(train) [75750/80000] base_lr: 3.3132e-06 lr: 3.3132e-06 eta: 2:18:38 time: 1.9544 data_time: 0.0181 memory: 6645 grad_norm: 3.8660 loss: 0.4421 decode.loss_ce: 0.2969 decode.acc_seg: 85.9905 aux.loss_ce: 0.1452 aux.acc_seg: 85.9540 2024/10/26 14:37:15 - mmengine - INFO - Iter(train) [75800/80000] base_lr: 3.2364e-06 lr: 3.2364e-06 eta: 2:17:00 time: 1.9469 data_time: 0.0193 memory: 6646 grad_norm: 3.4074 loss: 0.4228 decode.loss_ce: 0.2900 decode.acc_seg: 87.7391 aux.loss_ce: 0.1328 aux.acc_seg: 87.4448 2024/10/26 14:38:52 - mmengine - INFO - Iter(train) [75850/80000] base_lr: 3.1605e-06 lr: 3.1605e-06 eta: 2:15:22 time: 1.9473 data_time: 0.0182 memory: 6646 grad_norm: 3.3075 loss: 0.4714 decode.loss_ce: 0.3176 decode.acc_seg: 89.2807 aux.loss_ce: 0.1538 aux.acc_seg: 81.7854 2024/10/26 14:40:30 - mmengine - INFO - Iter(train) [75900/80000] base_lr: 3.0855e-06 lr: 3.0855e-06 eta: 2:13:44 time: 1.9434 data_time: 0.0193 memory: 6646 grad_norm: 4.0140 loss: 0.4825 decode.loss_ce: 0.3125 decode.acc_seg: 86.3505 aux.loss_ce: 0.1700 aux.acc_seg: 84.5720 2024/10/26 14:42:08 - mmengine - INFO - Iter(train) [75950/80000] base_lr: 3.0113e-06 lr: 3.0113e-06 eta: 2:12:06 time: 1.9486 data_time: 0.0197 memory: 6645 grad_norm: 3.2473 loss: 0.4554 decode.loss_ce: 0.3064 decode.acc_seg: 86.8602 aux.loss_ce: 0.1489 aux.acc_seg: 80.6506 2024/10/26 14:43:46 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005 2024/10/26 14:43:46 - mmengine - INFO - Iter(train) [76000/80000] base_lr: 2.9381e-06 lr: 2.9381e-06 eta: 2:10:29 time: 1.9434 data_time: 0.0190 memory: 6645 grad_norm: 3.2004 loss: 0.4397 decode.loss_ce: 0.2865 decode.acc_seg: 90.6794 aux.loss_ce: 0.1532 aux.acc_seg: 84.6785 2024/10/26 14:45:24 - mmengine - INFO - Iter(train) [76050/80000] base_lr: 2.8657e-06 lr: 2.8657e-06 eta: 
2:08:51 time: 1.9469 data_time: 0.0195 memory: 6645 grad_norm: 3.8686 loss: 0.4051 decode.loss_ce: 0.2641 decode.acc_seg: 92.8303 aux.loss_ce: 0.1410 aux.acc_seg: 91.0377
2024/10/26 14:47:01 - mmengine - INFO - Iter(train) [76100/80000] base_lr: 2.7942e-06 lr: 2.7942e-06 eta: 2:07:13 time: 1.9634 data_time: 0.0193 memory: 6646 grad_norm: 2.8313 loss: 0.4471 decode.loss_ce: 0.2988 decode.acc_seg: 86.8148 aux.loss_ce: 0.1483 aux.acc_seg: 85.2829
2024/10/26 14:48:43 - mmengine - INFO - Iter(train) [76150/80000] base_lr: 2.7235e-06 lr: 2.7235e-06 eta: 2:05:35 time: 1.9505 data_time: 0.0191 memory: 6645 grad_norm: 3.1492 loss: 0.3929 decode.loss_ce: 0.2647 decode.acc_seg: 85.0462 aux.loss_ce: 0.1283 aux.acc_seg: 82.7224
2024/10/26 14:50:20 - mmengine - INFO - Iter(train) [76200/80000] base_lr: 2.6538e-06 lr: 2.6538e-06 eta: 2:03:57 time: 1.9452 data_time: 0.0191 memory: 6645 grad_norm: 5.0201 loss: 0.3989 decode.loss_ce: 0.2614 decode.acc_seg: 86.6297 aux.loss_ce: 0.1375 aux.acc_seg: 86.8048
2024/10/26 14:51:58 - mmengine - INFO - Iter(train) [76250/80000] base_lr: 2.5849e-06 lr: 2.5849e-06 eta: 2:02:19 time: 1.9638 data_time: 0.0192 memory: 6646 grad_norm: 3.7628 loss: 0.4213 decode.loss_ce: 0.2843 decode.acc_seg: 88.3963 aux.loss_ce: 0.1370 aux.acc_seg: 82.8242
2024/10/26 14:53:36 - mmengine - INFO - Iter(train) [76300/80000] base_lr: 2.5170e-06 lr: 2.5170e-06 eta: 2:00:42 time: 1.9523 data_time: 0.0186 memory: 6645 grad_norm: 4.1160 loss: 0.5158 decode.loss_ce: 0.3328 decode.acc_seg: 82.3674 aux.loss_ce: 0.1829 aux.acc_seg: 77.1737
2024/10/26 14:55:13 - mmengine - INFO - Iter(train) [76350/80000] base_lr: 2.4499e-06 lr: 2.4499e-06 eta: 1:59:04 time: 1.9503 data_time: 0.0195 memory: 6645 grad_norm: 2.5646 loss: 0.4653 decode.loss_ce: 0.3120 decode.acc_seg: 88.2643 aux.loss_ce: 0.1532 aux.acc_seg: 87.7713
2024/10/26 14:56:51 - mmengine - INFO - Iter(train) [76400/80000] base_lr: 2.3837e-06 lr: 2.3837e-06 eta: 1:57:26 time: 1.9537 data_time: 0.0203 memory: 6645 grad_norm: 4.4505 loss: 0.3762 decode.loss_ce: 0.2431 decode.acc_seg: 87.8642 aux.loss_ce: 0.1331 aux.acc_seg: 84.3509
2024/10/26 14:58:29 - mmengine - INFO - Iter(train) [76450/80000] base_lr: 2.3184e-06 lr: 2.3184e-06 eta: 1:55:48 time: 1.9682 data_time: 0.0178 memory: 6646 grad_norm: 3.1963 loss: 0.4122 decode.loss_ce: 0.2775 decode.acc_seg: 81.3033 aux.loss_ce: 0.1347 aux.acc_seg: 80.2548
2024/10/26 15:00:06 - mmengine - INFO - Iter(train) [76500/80000] base_lr: 2.2540e-06 lr: 2.2540e-06 eta: 1:54:10 time: 1.9569 data_time: 0.0198 memory: 6646 grad_norm: 2.7426 loss: 0.4388 decode.loss_ce: 0.2858 decode.acc_seg: 82.4620 aux.loss_ce: 0.1531 aux.acc_seg: 82.1336
2024/10/26 15:01:45 - mmengine - INFO - Iter(train) [76550/80000] base_lr: 2.1904e-06 lr: 2.1904e-06 eta: 1:52:32 time: 1.9490 data_time: 0.0184 memory: 6646 grad_norm: 6.2441 loss: 0.4679 decode.loss_ce: 0.3028 decode.acc_seg: 87.7124 aux.loss_ce: 0.1651 aux.acc_seg: 88.0222
2024/10/26 15:03:22 - mmengine - INFO - Iter(train) [76600/80000] base_lr: 2.1278e-06 lr: 2.1278e-06 eta: 1:50:54 time: 1.9499 data_time: 0.0193 memory: 6645 grad_norm: 2.4464 loss: 0.4349 decode.loss_ce: 0.2821 decode.acc_seg: 89.3387 aux.loss_ce: 0.1528 aux.acc_seg: 90.3325
2024/10/26 15:05:00 - mmengine - INFO - Iter(train) [76650/80000] base_lr: 2.0661e-06 lr: 2.0661e-06 eta: 1:49:16 time: 1.9470 data_time: 0.0194 memory: 6645 grad_norm: 5.1765 loss: 0.4907 decode.loss_ce: 0.3226 decode.acc_seg: 87.9170 aux.loss_ce: 0.1682 aux.acc_seg: 86.7205
2024/10/26 15:06:38 - mmengine - INFO - Iter(train) [76700/80000] base_lr: 2.0052e-06 lr: 2.0052e-06 eta: 1:47:39 time: 1.9528 data_time: 0.0202 memory: 6645 grad_norm: 2.9466 loss: 0.3878 decode.loss_ce: 0.2548 decode.acc_seg: 88.5985 aux.loss_ce: 0.1330 aux.acc_seg: 88.1043
2024/10/26 15:08:15 - mmengine - INFO - Iter(train) [76750/80000] base_lr: 1.9452e-06 lr: 1.9452e-06 eta: 1:46:01 time: 1.9488 data_time: 0.0184 memory: 6645 grad_norm: 3.0010 loss: 0.3848 decode.loss_ce: 0.2507 decode.acc_seg: 88.2839 aux.loss_ce: 0.1341 aux.acc_seg: 86.1514
2024/10/26 15:09:54 - mmengine - INFO - Iter(train) [76800/80000] base_lr: 1.8862e-06 lr: 1.8862e-06 eta: 1:44:23 time: 1.9644 data_time: 0.0177 memory: 6645 grad_norm: 3.2888 loss: 0.4551 decode.loss_ce: 0.2924 decode.acc_seg: 90.5916 aux.loss_ce: 0.1627 aux.acc_seg: 80.8035
2024/10/26 15:11:32 - mmengine - INFO - Iter(train) [76850/80000] base_lr: 1.8280e-06 lr: 1.8280e-06 eta: 1:42:45 time: 1.9474 data_time: 0.0181 memory: 6646 grad_norm: 3.2854 loss: 0.4202 decode.loss_ce: 0.2852 decode.acc_seg: 89.3116 aux.loss_ce: 0.1350 aux.acc_seg: 87.8232
2024/10/26 15:13:09 - mmengine - INFO - Iter(train) [76900/80000] base_lr: 1.7707e-06 lr: 1.7707e-06 eta: 1:41:07 time: 1.9488 data_time: 0.0195 memory: 6645 grad_norm: 2.4959 loss: 0.4266 decode.loss_ce: 0.2808 decode.acc_seg: 84.9902 aux.loss_ce: 0.1458 aux.acc_seg: 83.1680
2024/10/26 15:14:47 - mmengine - INFO - Iter(train) [76950/80000] base_lr: 1.7144e-06 lr: 1.7144e-06 eta: 1:39:29 time: 1.9493 data_time: 0.0188 memory: 6645 grad_norm: 3.9971 loss: 0.4754 decode.loss_ce: 0.2964 decode.acc_seg: 86.1788 aux.loss_ce: 0.1789 aux.acc_seg: 81.6042
2024/10/26 15:16:24 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 15:16:24 - mmengine - INFO - Iter(train) [77000/80000] base_lr: 1.6589e-06 lr: 1.6589e-06 eta: 1:37:51 time: 1.9500 data_time: 0.0180 memory: 6647 grad_norm: 5.5876 loss: 0.5287 decode.loss_ce: 0.3499 decode.acc_seg: 86.9287 aux.loss_ce: 0.1788 aux.acc_seg: 84.7042
2024/10/26 15:18:02 - mmengine - INFO - Iter(train) [77050/80000] base_lr: 1.6043e-06 lr: 1.6043e-06 eta: 1:36:13 time: 1.9657 data_time: 0.0180 memory: 6645 grad_norm: 2.6562 loss: 0.4273 decode.loss_ce: 0.2865 decode.acc_seg: 86.4505 aux.loss_ce: 0.1408 aux.acc_seg: 85.1550
2024/10/26 15:19:45 - mmengine - INFO - Iter(train) [77100/80000] base_lr: 1.5507e-06 lr: 1.5507e-06 eta: 1:34:36 time: 1.9523 data_time: 0.0191 memory: 6645 grad_norm: 3.4643 loss: 0.4447 decode.loss_ce: 0.2851 decode.acc_seg: 89.4665 aux.loss_ce: 0.1596 aux.acc_seg: 73.0372
2024/10/26 15:21:22 - mmengine - INFO - Iter(train) [77150/80000] base_lr: 1.4979e-06 lr: 1.4979e-06 eta: 1:32:58 time: 1.9481 data_time: 0.0187 memory: 6645 grad_norm: 3.6111 loss: 0.4703 decode.loss_ce: 0.3073 decode.acc_seg: 88.2839 aux.loss_ce: 0.1629 aux.acc_seg: 84.5970
2024/10/26 15:23:00 - mmengine - INFO - Iter(train) [77200/80000] base_lr: 1.4460e-06 lr: 1.4460e-06 eta: 1:31:20 time: 1.9457 data_time: 0.0190 memory: 6646 grad_norm: 2.7925 loss: 0.4506 decode.loss_ce: 0.3046 decode.acc_seg: 91.9395 aux.loss_ce: 0.1460 aux.acc_seg: 92.2150
2024/10/26 15:24:38 - mmengine - INFO - Iter(train) [77250/80000] base_lr: 1.3951e-06 lr: 1.3951e-06 eta: 1:29:42 time: 1.9388 data_time: 0.0192 memory: 6645 grad_norm: 6.5334 loss: 0.4498 decode.loss_ce: 0.2894 decode.acc_seg: 90.6900 aux.loss_ce: 0.1603 aux.acc_seg: 89.9244
2024/10/26 15:26:15 - mmengine - INFO - Iter(train) [77300/80000] base_lr: 1.3450e-06 lr: 1.3450e-06 eta: 1:28:04 time: 1.9513 data_time: 0.0178 memory: 6646 grad_norm: 4.4183 loss: 0.4106 decode.loss_ce: 0.2780 decode.acc_seg: 88.2318 aux.loss_ce: 0.1326 aux.acc_seg: 78.5769
2024/10/26 15:27:53 - mmengine - INFO - Iter(train) [77350/80000] base_lr: 1.2958e-06 lr: 1.2958e-06 eta: 1:26:26 time: 1.9444 data_time: 0.0189 memory: 6645 grad_norm: 3.1086 loss: 0.3775 decode.loss_ce: 0.2515 decode.acc_seg: 87.6161 aux.loss_ce: 0.1261 aux.acc_seg: 82.0585
2024/10/26 15:29:31 - mmengine - INFO - Iter(train) [77400/80000] base_lr: 1.2476e-06 lr: 1.2476e-06 eta: 1:24:49 time: 1.9463 data_time: 0.0186 memory: 6646 grad_norm: 2.8698 loss: 0.5002 decode.loss_ce: 0.3355 decode.acc_seg: 85.1761 aux.loss_ce: 0.1647 aux.acc_seg: 84.0469
2024/10/26 15:31:09 - mmengine - INFO - Iter(train) [77450/80000] base_lr: 1.2002e-06 lr: 1.2002e-06 eta: 1:23:11 time: 1.9490 data_time: 0.0172 memory: 6646 grad_norm: 3.0986 loss: 0.4565 decode.loss_ce: 0.3054 decode.acc_seg: 85.7003 aux.loss_ce: 0.1511 aux.acc_seg: 80.7521
2024/10/26 15:32:47 - mmengine - INFO - Iter(train) [77500/80000] base_lr: 1.1538e-06 lr: 1.1538e-06 eta: 1:21:33 time: 1.9502 data_time: 0.0181 memory: 6646 grad_norm: 4.4447 loss: 0.4291 decode.loss_ce: 0.2936 decode.acc_seg: 74.2563 aux.loss_ce: 0.1355 aux.acc_seg: 77.3470
2024/10/26 15:34:24 - mmengine - INFO - Iter(train) [77550/80000] base_lr: 1.1083e-06 lr: 1.1083e-06 eta: 1:19:55 time: 1.9527 data_time: 0.0180 memory: 6645 grad_norm: 3.5308 loss: 0.4714 decode.loss_ce: 0.3137 decode.acc_seg: 89.3288 aux.loss_ce: 0.1577 aux.acc_seg: 86.3671
2024/10/26 15:36:02 - mmengine - INFO - Iter(train) [77600/80000] base_lr: 1.0636e-06 lr: 1.0636e-06 eta: 1:18:17 time: 1.9532 data_time: 0.0194 memory: 6646 grad_norm: 4.2702 loss: 0.4742 decode.loss_ce: 0.3031 decode.acc_seg: 79.2278 aux.loss_ce: 0.1711 aux.acc_seg: 68.9795
2024/10/26 15:37:43 - mmengine - INFO - Iter(train) [77650/80000] base_lr: 1.0199e-06 lr: 1.0199e-06 eta: 1:16:39 time: 1.9421 data_time: 0.0179 memory: 6646 grad_norm: 4.5262 loss: 0.4303 decode.loss_ce: 0.2834 decode.acc_seg: 89.2031 aux.loss_ce: 0.1469 aux.acc_seg: 88.7906
2024/10/26 15:39:20 - mmengine - INFO - Iter(train) [77700/80000] base_lr: 9.7713e-07 lr: 9.7713e-07 eta: 1:15:01 time: 1.9484 data_time: 0.0193 memory: 6645 grad_norm: 4.0251 loss: 0.4582 decode.loss_ce: 0.2987 decode.acc_seg: 90.6632 aux.loss_ce: 0.1595 aux.acc_seg: 90.6267
2024/10/26 15:40:58 - mmengine - INFO - Iter(train) [77750/80000] base_lr: 9.3523e-07 lr: 9.3523e-07 eta: 1:13:24 time: 1.9487 data_time: 0.0194 memory: 6645 grad_norm: 3.2350 loss: 0.4525 decode.loss_ce: 0.2936 decode.acc_seg: 81.2161 aux.loss_ce: 0.1589 aux.acc_seg: 73.7330
2024/10/26 15:42:35 - mmengine - INFO - Iter(train) [77800/80000] base_lr: 8.9425e-07 lr: 8.9425e-07 eta: 1:11:46 time: 1.9499 data_time: 0.0193 memory: 6645 grad_norm: 3.2187 loss: 0.4761 decode.loss_ce: 0.3186 decode.acc_seg: 79.5636 aux.loss_ce: 0.1576 aux.acc_seg: 78.1279
2024/10/26 15:44:13 - mmengine - INFO - Iter(train) [77850/80000] base_lr: 8.5418e-07 lr: 8.5418e-07 eta: 1:10:08 time: 1.9520 data_time: 0.0196 memory: 6645 grad_norm: 3.7608 loss: 0.4011 decode.loss_ce: 0.2650 decode.acc_seg: 89.8329 aux.loss_ce: 0.1362 aux.acc_seg: 89.2550
2024/10/26 15:45:51 - mmengine - INFO - Iter(train) [77900/80000] base_lr: 8.1502e-07 lr: 8.1502e-07 eta: 1:08:30 time: 1.9446 data_time: 0.0197 memory: 6646 grad_norm: 2.6542 loss: 0.4851 decode.loss_ce: 0.3221 decode.acc_seg: 88.1534 aux.loss_ce: 0.1630 aux.acc_seg: 87.2334
2024/10/26 15:47:29 - mmengine - INFO - Iter(train) [77950/80000] base_lr: 7.7677e-07 lr: 7.7677e-07 eta: 1:06:52 time: 1.9594 data_time: 0.0202 memory: 6645 grad_norm: 3.9341 loss: 0.4125 decode.loss_ce: 0.2642 decode.acc_seg: 89.2936 aux.loss_ce: 0.1483 aux.acc_seg: 86.4892
2024/10/26 15:49:07 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 15:49:07 - mmengine - INFO - Iter(train) [78000/80000] base_lr: 7.3944e-07 lr: 7.3944e-07 eta: 1:05:14 time: 1.9468 data_time: 0.0191 memory: 6645 grad_norm: 3.1573 loss: 0.3808 decode.loss_ce: 0.2441 decode.acc_seg: 91.0677 aux.loss_ce: 0.1367 aux.acc_seg: 89.7979
2024/10/26 15:50:45 - mmengine - INFO - Iter(train) [78050/80000] base_lr: 7.0302e-07 lr: 7.0302e-07 eta: 1:03:36 time: 1.9499 data_time: 0.0192 memory: 6646 grad_norm: 3.1993 loss: 0.4033 decode.loss_ce: 0.2667 decode.acc_seg: 87.1350 aux.loss_ce: 0.1366 aux.acc_seg: 86.5789
2024/10/26 15:52:22 - mmengine - INFO - Iter(train) [78100/80000] base_lr: 6.6751e-07 lr: 6.6751e-07 eta: 1:01:58 time: 1.9535 data_time: 0.0180 memory: 6645 grad_norm: 3.0387 loss: 0.4543 decode.loss_ce: 0.2957 decode.acc_seg: 81.9293 aux.loss_ce: 0.1586 aux.acc_seg: 77.4491
2024/10/26 15:54:00 - mmengine - INFO - Iter(train) [78150/80000] base_lr: 6.3292e-07 lr: 6.3292e-07 eta: 1:00:21 time: 1.9520 data_time: 0.0191 memory: 6645 grad_norm: 3.3172 loss: 0.4123 decode.loss_ce: 0.2684 decode.acc_seg: 84.4305 aux.loss_ce: 0.1439 aux.acc_seg: 81.1270
2024/10/26 15:55:38 - mmengine - INFO - Iter(train) [78200/80000] base_lr: 5.9924e-07 lr: 5.9924e-07 eta: 0:58:43 time: 1.9438 data_time: 0.0186 memory: 6644 grad_norm: 3.5785 loss: 0.4686 decode.loss_ce: 0.3056 decode.acc_seg: 86.9200 aux.loss_ce: 0.1630 aux.acc_seg: 75.5448
2024/10/26 15:57:16 - mmengine - INFO - Iter(train) [78250/80000] base_lr: 5.6649e-07 lr: 5.6649e-07 eta: 0:57:05 time: 1.9506 data_time: 0.0181 memory: 6645 grad_norm: 3.2662 loss: 0.4244 decode.loss_ce: 0.2836 decode.acc_seg: 92.8604 aux.loss_ce: 0.1408 aux.acc_seg: 92.7356
2024/10/26 15:58:54 - mmengine - INFO - Iter(train) [78300/80000] base_lr: 5.3464e-07 lr: 5.3464e-07 eta: 0:55:27 time: 1.9588 data_time: 0.0195 memory: 6645 grad_norm: 2.9966 loss: 0.4380 decode.loss_ce: 0.2884 decode.acc_seg: 82.3069 aux.loss_ce: 0.1495 aux.acc_seg: 81.7402
2024/10/26 16:00:32 - mmengine - INFO - Iter(train) [78350/80000] base_lr: 5.0372e-07 lr: 5.0372e-07 eta: 0:53:49 time: 1.9473 data_time: 0.0192 memory: 6646 grad_norm: 4.2301 loss: 0.4910 decode.loss_ce: 0.3175 decode.acc_seg: 86.7561 aux.loss_ce: 0.1734 aux.acc_seg: 83.2079
2024/10/26 16:02:10 - mmengine - INFO - Iter(train) [78400/80000] base_lr: 4.7371e-07 lr: 4.7371e-07 eta: 0:52:11 time: 1.9640 data_time: 0.0167 memory: 6646 grad_norm: 3.7603 loss: 0.4561 decode.loss_ce: 0.3080 decode.acc_seg: 88.5524 aux.loss_ce: 0.1481 aux.acc_seg: 88.8907
2024/10/26 16:03:47 - mmengine - INFO - Iter(train) [78450/80000] base_lr: 4.4462e-07 lr: 4.4462e-07 eta: 0:50:33 time: 1.9608 data_time: 0.0172 memory: 6644 grad_norm: 3.8646 loss: 0.4006 decode.loss_ce: 0.2631 decode.acc_seg: 88.1186 aux.loss_ce: 0.1374 aux.acc_seg: 85.2232
2024/10/26 16:05:25 - mmengine - INFO - Iter(train) [78500/80000] base_lr: 4.1645e-07 lr: 4.1645e-07 eta: 0:48:56 time: 1.9487 data_time: 0.0188 memory: 6645 grad_norm: 4.1074 loss: 0.4816 decode.loss_ce: 0.3094 decode.acc_seg: 87.2444 aux.loss_ce: 0.1721 aux.acc_seg: 82.0472
2024/10/26 16:07:02 - mmengine - INFO - Iter(train) [78550/80000] base_lr: 3.8919e-07 lr: 3.8919e-07 eta: 0:47:18 time: 1.9477 data_time: 0.0183 memory: 6645 grad_norm: 3.6067 loss: 0.3949 decode.loss_ce: 0.2646 decode.acc_seg: 92.1272 aux.loss_ce: 0.1303 aux.acc_seg: 92.0558
2024/10/26 16:08:43 - mmengine - INFO - Iter(train) [78600/80000] base_lr: 3.6286e-07 lr: 3.6286e-07 eta: 0:45:40 time: 1.9495 data_time: 0.0196 memory: 6645 grad_norm: 3.5870 loss: 0.4334 decode.loss_ce: 0.2902 decode.acc_seg: 93.5082 aux.loss_ce: 0.1432 aux.acc_seg: 93.9844
2024/10/26 16:10:21 - mmengine - INFO - Iter(train) [78650/80000] base_lr: 3.3745e-07 lr: 3.3745e-07 eta: 0:44:02 time: 1.9511 data_time: 0.0200 memory: 6645 grad_norm: 3.6411 loss: 0.4812 decode.loss_ce: 0.3119 decode.acc_seg: 83.5786 aux.loss_ce: 0.1693 aux.acc_seg: 83.6092
2024/10/26 16:11:59 - mmengine - INFO - Iter(train) [78700/80000] base_lr: 3.1295e-07 lr: 3.1295e-07 eta: 0:42:24 time: 1.9569 data_time: 0.0186 memory: 6645 grad_norm: 2.9800 loss: 0.4039 decode.loss_ce: 0.2704 decode.acc_seg: 86.6467 aux.loss_ce: 0.1335 aux.acc_seg: 86.9191
2024/10/26 16:13:36 - mmengine - INFO - Iter(train) [78750/80000] base_lr: 2.8938e-07 lr: 2.8938e-07 eta: 0:40:46 time: 1.9472 data_time: 0.0178 memory: 6646 grad_norm: 3.4486 loss: 0.5068 decode.loss_ce: 0.3249 decode.acc_seg: 87.6029 aux.loss_ce: 0.1819 aux.acc_seg: 80.7680
2024/10/26 16:15:14 - mmengine - INFO - Iter(train) [78800/80000] base_lr: 2.6673e-07 lr: 2.6673e-07 eta: 0:39:08 time: 1.9507 data_time: 0.0203 memory: 6646 grad_norm: 3.3829 loss: 0.4660 decode.loss_ce: 0.3175 decode.acc_seg: 88.4859 aux.loss_ce: 0.1485 aux.acc_seg: 86.8465
2024/10/26 16:16:52 - mmengine - INFO - Iter(train) [78850/80000] base_lr: 2.4499e-07 lr: 2.4499e-07 eta: 0:37:30 time: 1.9506 data_time: 0.0182 memory: 6646 grad_norm: 4.0805 loss: 0.4248 decode.loss_ce: 0.2876 decode.acc_seg: 87.0667 aux.loss_ce: 0.1372 aux.acc_seg: 87.6552
2024/10/26 16:18:30 - mmengine - INFO - Iter(train) [78900/80000] base_lr: 2.2418e-07 lr: 2.2418e-07 eta: 0:35:53 time: 1.9528 data_time: 0.0178 memory: 6646 grad_norm: 3.1406 loss: 0.3862 decode.loss_ce: 0.2570 decode.acc_seg: 91.5458 aux.loss_ce: 0.1292 aux.acc_seg: 89.7053
2024/10/26 16:20:08 - mmengine - INFO - Iter(train) [78950/80000] base_lr: 2.0430e-07 lr: 2.0430e-07 eta: 0:34:15 time: 1.9626 data_time: 0.0186 memory: 6644 grad_norm: 4.1939 loss: 0.4675 decode.loss_ce: 0.3090 decode.acc_seg: 87.4816 aux.loss_ce: 0.1585 aux.acc_seg: 84.5683
2024/10/26 16:21:45 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 16:21:45 - mmengine - INFO - Iter(train) [79000/80000] base_lr: 1.8533e-07 lr: 1.8533e-07 eta: 0:32:37 time: 1.9501 data_time: 0.0188 memory: 6646 grad_norm: 4.0266 loss: 0.4700 decode.loss_ce: 0.3253 decode.acc_seg: 89.1940 aux.loss_ce: 0.1447 aux.acc_seg: 87.4966
2024/10/26 16:23:23 - mmengine - INFO - Iter(train) [79050/80000] base_lr: 1.6729e-07 lr: 1.6729e-07 eta: 0:30:59 time: 1.9523 data_time: 0.0194 memory: 6648 grad_norm: 3.6073 loss: 0.4828 decode.loss_ce: 0.3228 decode.acc_seg: 82.7920 aux.loss_ce: 0.1599 aux.acc_seg: 81.8770
2024/10/26 16:25:01 - mmengine - INFO - Iter(train) [79100/80000] base_lr: 1.5017e-07 lr: 1.5017e-07 eta: 0:29:21 time: 1.9482 data_time: 0.0192 memory: 6647 grad_norm: 3.0321 loss: 0.4379 decode.loss_ce: 0.2975 decode.acc_seg: 86.2244 aux.loss_ce: 0.1404 aux.acc_seg: 86.7845
2024/10/26 16:26:38 - mmengine - INFO - Iter(train) [79150/80000] base_lr: 1.3397e-07 lr: 1.3397e-07 eta: 0:27:43 time: 1.9527 data_time: 0.0175 memory: 6647 grad_norm: 4.9613 loss: 0.4904 decode.loss_ce: 0.3202 decode.acc_seg: 88.3331 aux.loss_ce: 0.1702 aux.acc_seg: 88.2351
2024/10/26 16:28:16 - mmengine - INFO - Iter(train) [79200/80000] base_lr: 1.1869e-07 lr: 1.1869e-07 eta: 0:26:05 time: 1.9601 data_time: 0.0198 memory: 6645 grad_norm: 3.0528 loss: 0.3792 decode.loss_ce: 0.2548 decode.acc_seg: 90.3763 aux.loss_ce: 0.1244 aux.acc_seg: 91.4915
2024/10/26 16:29:54 - mmengine - INFO - Iter(train) [79250/80000] base_lr: 1.0434e-07 lr: 1.0434e-07 eta: 0:24:28 time: 1.9411 data_time: 0.0180 memory: 6645 grad_norm: 3.7446 loss: 0.4007 decode.loss_ce: 0.2683 decode.acc_seg: 86.4241 aux.loss_ce: 0.1323 aux.acc_seg: 85.3628
2024/10/26 16:31:31 - mmengine - INFO - Iter(train) [79300/80000] base_lr: 9.0913e-08 lr: 9.0913e-08 eta: 0:22:50 time: 1.9508 data_time: 0.0174 memory: 6647 grad_norm: 3.8507 loss: 0.4438 decode.loss_ce: 0.2964 decode.acc_seg: 83.5704 aux.loss_ce: 0.1474 aux.acc_seg: 83.6366
2024/10/26 16:33:09 - mmengine - INFO - Iter(train) [79350/80000] base_lr: 7.8409e-08 lr: 7.8409e-08 eta: 0:21:12 time: 1.9715 data_time: 0.0194 memory: 6646 grad_norm: 3.7357 loss: 0.4524 decode.loss_ce: 0.3001 decode.acc_seg: 91.1449 aux.loss_ce: 0.1522 aux.acc_seg: 83.5851
2024/10/26 16:34:47 - mmengine - INFO - Iter(train) [79400/80000] base_lr: 6.6830e-08 lr: 6.6830e-08 eta: 0:19:34 time: 1.9529 data_time: 0.0194 memory: 6645 grad_norm: 7.2207 loss: 0.4376 decode.loss_ce: 0.2933 decode.acc_seg: 77.7766 aux.loss_ce: 0.1443 aux.acc_seg: 71.3390
2024/10/26 16:36:25 - mmengine - INFO - Iter(train) [79450/80000] base_lr: 5.6174e-08 lr: 5.6174e-08 eta: 0:17:56 time: 1.9505 data_time: 0.0188 memory: 6645 grad_norm: 3.3349 loss: 0.4370 decode.loss_ce: 0.2941 decode.acc_seg: 86.4397 aux.loss_ce: 0.1428 aux.acc_seg: 86.9105
2024/10/26 16:38:03 - mmengine - INFO - Iter(train) [79500/80000] base_lr: 4.6443e-08 lr: 4.6443e-08 eta: 0:16:18 time: 1.9666 data_time: 0.0179 memory: 6645 grad_norm: 3.3218 loss: 0.4955 decode.loss_ce: 0.3220 decode.acc_seg: 83.4769 aux.loss_ce: 0.1735 aux.acc_seg: 83.9815
2024/10/26 16:39:43 - mmengine - INFO - Iter(train) [79550/80000] base_lr: 3.7636e-08 lr: 3.7636e-08 eta: 0:14:40 time: 1.9506 data_time: 0.0187 memory: 6645 grad_norm: 2.7430 loss: 0.4583 decode.loss_ce: 0.2926 decode.acc_seg: 89.1527 aux.loss_ce: 0.1657 aux.acc_seg: 87.2737
2024/10/26 16:41:21 - mmengine - INFO - Iter(train) [79600/80000] base_lr: 2.9755e-08 lr: 2.9755e-08 eta: 0:13:02 time: 1.9496 data_time: 0.0175 memory: 6647 grad_norm: 2.9042 loss: 0.4472 decode.loss_ce: 0.3013 decode.acc_seg: 89.7566 aux.loss_ce: 0.1459 aux.acc_seg: 88.9538
2024/10/26 16:42:59 - mmengine - INFO - Iter(train) [79650/80000] base_lr: 2.2798e-08 lr: 2.2798e-08 eta: 0:11:25 time: 1.9474 data_time: 0.0190 memory: 6645 grad_norm: 3.4944 loss: 0.4652 decode.loss_ce: 0.3115 decode.acc_seg: 86.2635 aux.loss_ce: 0.1537 aux.acc_seg: 87.1899
2024/10/26 16:44:37 - mmengine - INFO - Iter(train) [79700/80000] base_lr: 1.6765e-08 lr: 1.6765e-08 eta: 0:09:47 time: 1.9496 data_time: 0.0191 memory: 6645 grad_norm: 4.3415 loss: 0.5069 decode.loss_ce: 0.3351 decode.acc_seg: 81.0198 aux.loss_ce: 0.1718 aux.acc_seg: 77.7200
2024/10/26 16:46:14 - mmengine - INFO - Iter(train) [79750/80000] base_lr: 1.1658e-08 lr: 1.1658e-08 eta: 0:08:09 time: 1.9523 data_time: 0.0196 memory: 6645 grad_norm: 3.6054 loss: 0.5023 decode.loss_ce: 0.3254 decode.acc_seg: 86.1704 aux.loss_ce: 0.1769 aux.acc_seg: 82.4522
2024/10/26 16:47:52 - mmengine - INFO - Iter(train) [79800/80000] base_lr: 7.4763e-09 lr: 7.4763e-09 eta: 0:06:31 time: 1.9527 data_time: 0.0175 memory: 6646 grad_norm: 3.4873 loss: 0.3732 decode.loss_ce: 0.2563 decode.acc_seg: 86.3972 aux.loss_ce: 0.1170 aux.acc_seg: 85.8341
2024/10/26 16:49:29 - mmengine - INFO - Iter(train) [79850/80000] base_lr: 4.2194e-09 lr: 4.2194e-09 eta: 0:04:53 time: 1.9509 data_time: 0.0197 memory: 6646 grad_norm: 3.7381 loss: 0.4317 decode.loss_ce: 0.2898 decode.acc_seg: 89.1390 aux.loss_ce: 0.1418 aux.acc_seg: 90.4500
2024/10/26 16:51:07 - mmengine - INFO - Iter(train) [79900/80000] base_lr: 1.8877e-09 lr: 1.8877e-09 eta: 0:03:15 time: 1.9425 data_time: 0.0184 memory: 6645 grad_norm: 3.6575 loss: 0.4993 decode.loss_ce: 0.3206 decode.acc_seg: 84.4375 aux.loss_ce: 0.1786 aux.acc_seg: 83.1758
2024/10/26 16:52:44 - mmengine - INFO - Iter(train) [79950/80000] base_lr: 4.8133e-10 lr: 4.8133e-10 eta: 0:01:37 time: 1.9687 data_time: 0.0194 memory: 6646 grad_norm: 2.6650 loss: 0.3679 decode.loss_ce: 0.2477 decode.acc_seg: 86.7832 aux.loss_ce: 0.1202 aux.acc_seg: 84.3616
2024/10/26 16:54:22 - mmengine - INFO - Exp name: deeplabv3_mobilemamba_b4-80k_ade20k-512x512_20241024_212005
2024/10/26 16:54:22 - mmengine - INFO - Iter(train) [80000/80000] base_lr: 1.8506e-13 lr: 1.8506e-13 eta: 0:00:00 time: 1.9747 data_time: 0.0183 memory: 6646 grad_norm: 3.6781 loss: 0.4822 decode.loss_ce: 0.3245 decode.acc_seg: 86.1877 aux.loss_ce: 0.1577 aux.acc_seg: 82.9952
2024/10/26 16:54:22 - mmengine - INFO - Saving checkpoint at 80000 iterations
2024/10/26 16:54:26 - mmengine - INFO - Iter(val) [ 50/500] eta: 0:00:15 time: 0.0332 data_time: 0.0017 memory: 1049
2024/10/26 16:54:28 - mmengine - INFO - Iter(val) [100/500] eta: 0:00:13 time: 0.0344 data_time: 0.0016 memory: 1117
2024/10/26 16:54:30 - mmengine - INFO - Iter(val) [150/500] eta: 0:00:12 time: 0.0324 data_time: 0.0016 memory: 833
2024/10/26 16:54:31 - mmengine - INFO - Iter(val) [200/500] eta: 0:00:10 time: 0.0321 data_time: 0.0015 memory: 866
2024/10/26 16:54:33 - mmengine - INFO - Iter(val) [250/500] eta: 0:00:08 time: 0.0323 data_time: 0.0016 memory: 906
2024/10/26 16:54:34 - mmengine - INFO - Iter(val) [300/500] eta: 0:00:06 time: 0.0325 data_time: 0.0019 memory: 2028
2024/10/26 16:54:36 - mmengine - INFO - Iter(val) [350/500] eta: 0:00:05 time: 0.0318 data_time: 0.0016 memory: 832
2024/10/26 16:54:38 - mmengine - INFO - Iter(val) [400/500] eta: 0:00:03 time: 0.0324 data_time: 0.0016 memory: 904
2024/10/26 16:54:39 - mmengine - INFO - Iter(val) [450/500] eta: 0:00:01 time: 0.0351 data_time: 0.0017 memory: 839
2024/10/26 16:54:41 - mmengine - INFO - Iter(val) [500/500] eta: 0:00:00 time: 0.0315 data_time: 0.0014 memory: 889
2024/10/26 16:54:42 - mmengine - INFO - per class results:
2024/10/26 16:54:42 - mmengine - INFO -
+---------------------+-------+-------+
| Class | IoU | Acc |
+---------------------+-------+-------+
| wall | 67.16 | 84.31 |
| building | 76.65 | 90.25 |
| sky | 88.64 | 94.47 |
| floor | 71.02 | 86.5 |
| tree | 65.99 | 83.42 |
| ceiling | 77.06 | 87.75 |
| road | 77.45 | 87.28 |
| bed | 79.64 | 88.61 |
| windowpane | 52.21 | 69.72 |
| grass | 59.4 | 77.96 |
| cabinet | 53.34 | 66.51 |
| sidewalk | 56.48 | 72.56 |
| person | 59.85 | 78.2 |
| earth | 31.01 | 44.28 |
| door | 36.76 | 48.4 |
| table | 42.66 | 60.74 |
| mountain | 53.93 | 68.04 |
| plant | 41.14 | 51.83 |
| curtain | 57.01 | 72.15 |
| chair | 39.02 | 53.22 |
| car | 69.1 | 81.93 |
| water | 42.69 | 54.85 |
| painting | 51.62 | 66.33 |
| sofa | 56.74 | 74.63 |
| shelf | 30.45 | 43.52 |
| house | 44.0 | 54.61 |
| sea | 46.16 | 70.91 |
| mirror | 53.66 | 61.6 |
| rug | 47.91 | 60.05 |
| field | 28.84 | 43.38 |
| armchair | 34.3 | 50.73 |
| seat | 57.06 | 75.97 |
| fence | 35.58 | 49.66 |
| desk | 39.34 | 54.4 |
| rock | 35.76 | 58.93 |
| wardrobe | 47.17 | 59.34 |
| lamp | 32.86 | 42.79 |
| bathtub | 67.81 | 75.26 |
| railing | 24.59 | 34.23 |
| cushion | 34.26 | 44.93 |
| base | 24.42 | 32.91 |
| box | 14.17 | 21.94 |
| column | 25.76 | 32.7 |
| signboard | 18.12 | 24.9 |
| chest of drawers | 33.48 | 47.94 |
| counter | 24.16 | 30.87 |
| sand | 34.08 | 51.36 |
| sink | 51.44 | 65.02 |
| skyscraper | 40.91 | 53.86 |
| fireplace | 61.93 | 75.99 |
| refrigerator | 69.6 | 82.25 |
| grandstand | 38.46 | 61.11 |
| path | 19.11 | 33.17 |
| stairs | 18.64 | 24.46 |
| runway | 67.6 | 82.45 |
| case | 43.07 | 57.51 |
| pool table | 76.71 | 83.77 |
| pillow | 43.72 | 55.15 |
| screen door | 61.75 | 67.12 |
| stairway | 18.29 | 28.32 |
| river | 10.6 | 23.79 |
| bridge | 44.02 | 52.33 |
| bookcase | 27.04 | 38.89 |
| blind | 29.33 | 33.24 |
| coffee table | 47.92 | 64.97 |
| toilet | 64.84 | 75.04 |
| flower | 23.77 | 37.12 |
| book | 31.68 | 50.06 |
| hill | 12.45 | 19.27 |
| bench | 34.67 | 42.88 |
| countertop | 46.78 | 62.91 |
| stove | 51.75 | 66.57 |
| palm | 31.06 | 49.94 |
| kitchen island | 28.26 | 48.41 |
| computer | 45.84 | 55.37 |
| swivel chair | 36.89 | 49.21 |
| boat | 34.02 | 47.03 |
| bar | 17.87 | 21.98 |
| arcade machine | 21.92 | 23.24 |
| hovel | 27.14 | 30.77 |
| bus | 76.09 | 87.96 |
| towel | 40.49 | 51.85 |
| light | 10.75 | 11.83 |
| truck | 13.83 | 20.46 |
| tower | 28.84 | 51.7 |
| chandelier | 48.64 | 63.32 |
| awning | 12.97 | 15.66 |
| streetlight | 6.29 | 7.54 |
| booth | 39.89 | 41.24 |
| television receiver | 49.13 | 63.42 |
| airplane | 38.36 | 49.03 |
| dirt track | 15.0 | 44.83 |
| apparel | 34.89 | 48.4 |
| pole | 4.6 | 5.83 |
| land | 0.0 | 0.0 |
| bannister | 1.86 | 2.24 |
| escalator | 37.15 | 58.46 |
| ottoman | 26.62 | 40.61 |
| bottle | 23.7 | 41.7 |
| buffet | 37.52 | 40.71 |
| poster | 19.86 | 26.47 |
| stage | 9.75 | 14.74 |
| van | 35.43 | 46.85 |
| ship | 61.77 | 78.56 |
| fountain | 18.23 | 18.28 |
| conveyer belt | 64.42 | 80.17 |
| canopy | 12.39 | 16.71 |
| washer | 53.9 | 56.8 |
| plaything | 8.58 | 14.86 |
| swimming pool | 75.12 | 77.36 |
| stool | 24.09 | 31.26 |
| barrel | 40.64 | 64.46 |
| basket | 12.94 | 17.82 |
| waterfall | 50.01 | 70.54 |
| tent | 88.81 | 94.95 |
| bag | 5.23 | 6.55 |
| minibike | 35.54 | 46.8 |
| cradle | 69.65 | 86.88 |
| oven | 39.82 | 55.83 |
| ball | 33.58 | 44.95 |
| food | 35.29 | 46.14 |
| step | 10.83 | 13.11 |
| tank | 34.48 | 35.62 |
| trade name | 10.29 | 11.62 |
| microwave | 35.8 | 40.25 |
| pot | 22.39 | 27.37 |
| animal | 36.51 | 40.61 |
| bicycle | 26.01 | 46.8 |
| lake | 29.66 | 43.6 |
| dishwasher | 47.15 | 53.76 |
| screen | 53.12 | 68.1 |
| blanket | 9.01 | 10.13 |
| sculpture | 29.07 | 43.17 |
| hood | 43.26 | 48.76 |
| sconce | 16.06 | 18.78 |
| vase | 13.88 | 19.99 |
| traffic light | 10.01 | 14.2 |
| tray | 1.61 | 3.48 |
| ashcan | 23.27 | 29.62 |
| fan | 22.05 | 26.8 |
| pier | 22.65 | 32.97 |
| crt screen | 3.83 | 9.49 |
| plate | 22.12 | 29.61 |
| monitor | 1.66 | 2.03 |
| bulletin board | 33.58 | 37.64 |
| shower | 0.0 | 0.0 |
| radiator | 34.99 | 39.75 |
| glass | 1.29 | 1.36 |
| clock | 16.41 | 18.68 |
| flag | 21.15 | 22.57 |
+---------------------+-------+-------+
2024/10/26 16:54:42 - mmengine - INFO - Iter(val) [500/500] aAcc: 76.3000 mIoU: 36.6200 mAcc: 47.0900 data_time: 0.0018 time: 0.0331
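The final val summary reports aAcc, mIoU, and mAcc over the 150 ADE20K classes. As a rough illustration of how such numbers are derived (a minimal NumPy sketch from a pixel confusion matrix; the function name `seg_metrics` is ours, this is not mmengine's `IoUMetric` implementation):

```python
import numpy as np

def seg_metrics(conf):
    """conf[i, j] = number of pixels with ground-truth class i predicted as class j."""
    conf = np.asarray(conf, dtype=np.float64)
    tp = np.diag(conf)              # correctly classified pixels per class
    gt = conf.sum(axis=1)           # pixels per ground-truth class
    pred = conf.sum(axis=0)         # pixels per predicted class
    iou = tp / (gt + pred - tp)     # per-class IoU (NaN for absent classes)
    acc = tp / gt                   # per-class Acc (recall)
    return {
        "aAcc": tp.sum() / conf.sum(),  # overall pixel accuracy
        "mIoU": np.nanmean(iou),        # mean IoU over classes
        "mAcc": np.nanmean(acc),        # mean per-class accuracy
    }
```

For example, `seg_metrics([[3, 1], [1, 5]])` gives aAcc = 8/10, per-class IoU = (3/5, 5/7), and per-class Acc = (3/4, 5/6), which average to the reported mIoU and mAcc.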