Python's syntax differs quite a bit from C++, MATLAB, and Java.
1. Parentheses and function calls
def divide_by_3(x):
    return x / 3.

print(divide_by_3)     # called without parentheses: <function divide_by_3 at 0x139c756a8>
print(divide_by_3(3))  # called with parentheses: 1.0
Called without parentheses, the name refers to the function object itself (effectively its address in memory); called with parentheses, the function body is executed with the given arguments and its result is returned.
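Because the bare name is just a reference to the function object, it can be assigned to another variable or passed to other code that will call it later. A minimal sketch reusing the function above:

def divide_by_3(x):
    return x / 3.

# Assigning the function object (no parentheses) creates another name for it.
alias = divide_by_3
print(alias(9))   # 3.0 -- the alias is just as callable

# Passing the function object lets other code decide when to call it.
print(list(map(divide_by_3, [3, 6, 9])))   # [1.0, 2.0, 3.0]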
2. Parentheses and class calls
class test():
    y = 'this is out of __init__()'

    def __init__(self):
        self.y = 'this is in the __init__()'

x = test    # x refers to the class object itself; nothing is instantiated
print(x.y)  # prints the class attribute: this is out of __init__()
x = test()  # the parentheses instantiate the class, which runs __init__()
print(x.y)  # prints the instance attribute: this is in the __init__()
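Since the bare class name is only a reference to the class object, it too can be passed around and instantiated later. A small sketch reusing the `test` class above (the helper name `make_instance` is made up for illustration):

class test():
    y = 'this is out of __init__()'

    def __init__(self):
        self.y = 'this is in the __init__()'

def make_instance(cls):
    # cls arrives as a class object (no parentheses at the call site);
    # adding () here creates the instance on demand.
    return cls()

obj = make_instance(test)
print(obj.y)                  # this is in the __init__()
print(isinstance(obj, test))  # True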
3. The function(#)(input) pattern
def With_func_rtn(a):
    print("this is func with another func as return")
    print(a)

    def func(b):
        print("this is another function")
        print(b)

    return func

With_func_rtn(2018)(11)
# Output:
# this is func with another func as return
# 2018
# this is another function
# 11
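Returning an inner function like this is how closures (and decorators) work in Python: the returned function remembers the arguments of the call that produced it. A minimal sketch with made-up names:

def multiplier(factor):
    # The inner function closes over factor and keeps it available.
    def multiply(x):
        return x * factor
    return multiply

double = multiplier(2)    # first (): build and return the inner function
print(double(21))         # second (): call the returned function -> 42
print(multiplier(3)(14))  # the two calls chained on one line    -> 42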
In practice, this pattern is seen most often when building convolutional neural networks, for example with the Keras functional API:
from keras.layers import Input, ZeroPadding2D, Conv2D, BatchNormalization, Activation, MaxPooling2D, Flatten, Dense
from keras.models import Model

def model(input_shape):
    # Define the input placeholder as a tensor with shape input_shape.
    X_input = Input(input_shape)

    # Zero-Padding: pads the border of X_input with zeroes
    X = ZeroPadding2D((3, 3))(X_input)

    # CONV -> BN -> RELU block applied to X
    X = Conv2D(32, (7, 7), strides=(1, 1), name='conv0')(X)
    X = BatchNormalization(axis=3, name='bn0')(X)
    X = Activation('relu')(X)

    # MAXPOOL
    X = MaxPooling2D((2, 2), name='max_pool')(X)

    # FLATTEN X (i.e. convert it to a vector) + FULLYCONNECTED
    X = Flatten()(X)
    X = Dense(1, activation='sigmoid', name='fc')(X)

    # Create the Keras model instance; you'll use this instance to train/test the model.
    model = Model(inputs=X_input, outputs=X, name='HappyModel')

    return model
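Each layer line such as `Conv2D(32, (7, 7), ...)(X)` is exactly this pattern: the first pair of parentheses constructs a layer object, and the second pair calls that object on a tensor, which works because the layer class defines `__call__`. A toy sketch of the mechanism (a made-up class, not the real Keras implementation):

class Scale:
    """Toy stand-in for a Keras layer: configured once, then called on data."""

    def __init__(self, factor):
        self.factor = factor   # first (): store the layer's configuration

    def __call__(self, x):
        # second (): apply the configured operation to the input
        return [v * self.factor for v in x]

print(Scale(3)([1, 2, 3]))   # [3, 6, 9]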
Summary
The above is an introduction to the function(#)(X) call pattern and the role of parentheses in Python 3.x. I hope it is helpful; if you have any questions, please leave a comment and I will reply as soon as possible. Thanks as well for your support of w3xue.