Rand Index (兰德系数)

NOTE:

1. A small worked example showing how the Rand Index is computed, in detail.

2. A discussion of the Rand Index: https://stats.stackexchange.com/questions/89030/rand-index-calculation

Note the following definitions:

+--------------------------------+--------------------------------------+
| TP:                            | FN:                                  |
| same class + same cluster      | same class + different clusters      |
+--------------------------------+--------------------------------------+
| FP:                            | TN:                                  |
| different class + same cluster | different class + different clusters |
+--------------------------------+--------------------------------------+

For example, given the following contingency matrix:

  | 1 | 2 | 3
--+---+---+---
x | 5 | 1 | 2
--+---+---+---
o | 1 | 4 | 0
--+---+---+---
◊ | 0 | 1 | 3

Here x, o, and ◊ are the classes, while 1, 2, 3 are the clusters. (For the full walkthrough, see the link above.)

The row direction corresponds to "same class": summing "choose 2" over the row totals gives TP + FN. The column direction corresponds to "same cluster": summing "choose 2" over the column totals gives TP + FP.

Choosing 2 out of the total number of points gives the grand total TP + FN + FP + TN, and summing "choose 2" over every cell of the matrix gives TP.

The remaining counts are then obtained by subtraction.

Rand Index = (TP + TN) / (TP + FP + FN + TN)

Precision = TP / (TP + FP)

Recall = TP / (TP + FN)
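
To make the counting concrete, here is a minimal Python sketch that works through the matrix above (using math.comb from the standard library for the "choose 2" counts; the intermediate values in the comments follow from the matrix itself):

from math import comb

# Contingency matrix from the example: rows = classes (x, o, ◊), columns = clusters (1, 2, 3)
M = [[5, 1, 2],
     [1, 4, 0],
     [0, 1, 3]]

n = sum(sum(row) for row in M)                          # 17 points in total
total = comb(n, 2)                                      # TP+FP+FN+TN = C(17,2) = 136

tp_plus_fn = sum(comb(sum(row), 2) for row in M)        # row direction (same class)      -> 44
tp_plus_fp = sum(comb(sum(col), 2) for col in zip(*M))  # column direction (same cluster) -> 40
tp = sum(comb(v, 2) for row in M for v in row)          # "choose 2" per cell             -> 20

fp = tp_plus_fp - tp                                    # 20
fn = tp_plus_fn - tp                                    # 24
tn = total - tp - fp - fn                               # 72

print((tp + tn) / total)   # Rand Index ≈ 0.676
print(tp / (tp + fp))      # Precision  = 0.5
print(tp / (tp + fn))      # Recall     ≈ 0.455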

Python version:

from keras.datasets import fashion_mnist
import numpy as np

def rand_index(y_true, y_pred):
    """Rand Index computed by checking every pair of samples (O(n^2))."""
    n = len(y_true)
    a, b = 0, 0                                # a: TP pairs, b: TN pairs
    for i in range(n):
        for j in range(i + 1, n):
            same_true = y_true[i] == y_true[j]
            same_pred = y_pred[i] == y_pred[j]
            if same_true and same_pred:
                a += 1
            elif not same_true and not same_pred:
                b += 1
    return (a + b) / (n * (n - 1) / 2)         # (TP + TN) / C(n, 2)

# Fashion-MNIST test labels as ground truth vs. 10000 random cluster labels
_, (_, y_true) = fashion_mnist.load_data()
y_random = np.random.randint(0, 10, 10000)


print(rand_index(y_true, y_random))
## output : 0.8200785478547855
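
As a cross-check, continuing from the snippet above, scikit-learn computes the same quantity plus the chance-corrected variant (assuming a version of scikit-learn, 0.24 or later, that provides sklearn.metrics.rand_score):

from sklearn.metrics import rand_score, adjusted_rand_score

print(rand_score(y_true, y_random))           # should match rand_index(y_true, y_random)
print(adjusted_rand_score(y_true, y_random))  # adjusted Rand Index, close to 0 for random labels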

MATLAB version:

function RI = RandIndex(X,Y)
%RANDINDEX Rand Index of two label vectors for the same data points.
%   X, Y: the two label assignments (e.g. ground-truth classes and cluster labels).

X = X(:);
Y = Y(:);

TP = 0;
TN = 0;
N = length(X);
C_N2 = nchoosek(N,2);
PIS = 1:N;                              % PIS: Positive Integer Sequence
ordinal_sequence = nchoosek(PIS,2);     % all index pairs (i,j) with i < j
for k = 1:C_N2
    index = ordinal_sequence(k,:);
    if X(index(1))==X(index(2)) && Y(index(1))==Y(index(2))
        TP = TP+1;                      % pair agrees in both labelings
    end
    if X(index(1))~=X(index(2)) && Y(index(1))~=Y(index(2))
        TN = TN+1;                      % pair differs in both labelings
    end
end
RI = (TP+TN)/C_N2;

Self-written MATLAB version (corrections welcome):

close all;clear;clc;

%% 
% Binary images
SEG = im2bw(imread('0_8_8976_GCMF.png'));   % segmentation result
GT = im2bw(imread('0_8_8976_mask.png'));    % ground-truth mask

subplot(2,2,1);imshow(GT);
title('GT image','Color', 'g');
hold on;
subplot(2,2,2);imshow(SEG);
title('SEG image','Color', 'r');
hold on;
subplot(2,2,3);imshow(GT&SEG);
title('GT \cap SEG ','Color', 'b');
hold on;
subplot(2,2,4);imshow(GT|SEG);
title('GT \cup SEG ','Color', 'm');

%%
% Count the pixels in each cell of the binary contingency table:
% foreground-foreground (f_f) \ foreground-background (f_b)
% background-foreground (b_f) \ background-background (b_b)
f_f = length(find(GT==1&SEG==1));
f_b = length(find(GT==1&SEG==0));
b_f = length(find(GT==0&SEG==1));
b_b = length(find(GT==0&SEG==0));

%%
% Count the foreground and background pixels of the ground truth and of the segmentation
gt_f = f_f + f_b;
gt_b = b_f + b_b;

seg_f = f_f + b_f;
seg_b = f_b + b_b;

%%
% Compute tp, tn, fp, fn from pair counts
num_pixels = gt_f + gt_b;
total = nchoosek(num_pixels,2);

tp_plus_fp = nchoosek(seg_f,2)+nchoosek(seg_b,2);
tp_plus_fn = nchoosek(gt_f,2)+nchoosek(gt_b,2);

tp = nchoosek(f_f,2)+nchoosek(b_f,2)+nchoosek(b_b,2)+nchoosek(f_b,2);
fp = tp_plus_fp - tp;
fn = tp_plus_fn - tp;
tn = total - tp - fp - fn;

%%
% Compute precision, recall, and the Rand Index
precision = tp/(tp_plus_fp);
recall = tp/(tp_plus_fn);
rand_index = (tp+tn)/total;

disp(['precision = ',num2str(precision)]);
disp(['recall = ',num2str(recall)]);
disp(['rand_index = ',num2str(rand_index)]);
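
For reference, here is a minimal Python/NumPy sketch of the same pair-counting computation on two binary masks. Reading the images with imageio is an assumption on my part; any loader that yields a single-channel binary array will do:

from math import comb
import numpy as np
import imageio.v3 as iio  # assumed dependency, only used to read the masks

def rand_index_binary(gt, seg):
    """Rand Index, precision, and recall for two binary masks via pair counting."""
    gt = np.asarray(gt, dtype=bool).ravel()
    seg = np.asarray(seg, dtype=bool).ravel()

    # Cells of the 2x2 contingency table (same roles as f_f, f_b, b_f, b_b above)
    f_f = int(np.sum(gt & seg))
    f_b = int(np.sum(gt & ~seg))
    b_f = int(np.sum(~gt & seg))
    b_b = int(np.sum(~gt & ~seg))

    total = comb(gt.size, 2)                              # all pixel pairs
    tp = sum(comb(c, 2) for c in (f_f, f_b, b_f, b_b))    # pairs that agree in both images
    tp_plus_fp = comb(f_f + b_f, 2) + comb(f_b + b_b, 2)  # pairs inside one SEG region
    tp_plus_fn = comb(f_f + f_b, 2) + comb(b_f + b_b, 2)  # pairs inside one GT region
    fp = tp_plus_fp - tp
    fn = tp_plus_fn - tp
    tn = total - tp - fp - fn

    return (tp + tn) / total, tp / tp_plus_fp, tp / tp_plus_fn

gt = iio.imread('0_8_8976_mask.png') > 0    # assumes single-channel masks
seg = iio.imread('0_8_8976_GCMF.png') > 0
ri, precision, recall = rand_index_binary(gt, seg)
print('precision =', precision, 'recall =', recall, 'rand_index =', ri)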

 
