{"group":{"id":1,"name":"Community","lockable":false,"created_at":"2012-01-18T18:02:15.000Z","updated_at":"2026-04-06T14:01:22.000Z","description":"Problems submitted by members of the MATLAB Central community.","is_default":true,"created_by":161519,"badge_id":null,"featured":false,"trending":false,"solution_count_in_trending_period":0,"trending_last_calculated":"2026-04-06T00:00:00.000Z","image_id":null,"published":true,"community_created":false,"status_id":2,"is_default_group_for_player":false,"deleted_by":null,"deleted_at":null,"restored_by":null,"restored_at":null,"description_opc":null,"description_html":null,"published_at":null},"problems":[{"id":58882,"title":"Neural Nets: Activation functions","description":"Return values of selected Activation function type for value,vector, and matrices.\r\ny=Activation(x,id); where id is 1:4 for ReLU, sigmoid, hyperbolic_tan, Softmax\r\nReLU: Rectified Linear Unit, clips negatives  max(0,x)   Trains faster than sigmoid\r\nSigmoid: Exponential normalization [0:1]      \r\nHyperTan: Normalization[-1:1]   tanh(x)\r\nSoftmax: Normalizes output sum to 1, individual values [0:1]      Used on Output node\r\n\r\nWorking though a series of Neural Net challenges from Perceptron, Hidden Layers, Back Propogation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from Mnist. \r\nMight take a day or two  to completely cover Neural Nets in a Matlab centric fashion. 
\r\nEssentially Out=Softmax(ReLU(X*W)*WP)","description_html":"\u003cdiv style = \"text-align: start; line-height: 20.4333px; min-height: 0px; white-space: normal; color: rgb(0, 0, 0); font-family: Menlo, Monaco, Consolas, monospace; font-style: normal; font-size: 14px; font-weight: 400; text-decoration: rgb(0, 0, 0); white-space: normal; \"\u003e\u003cdiv style=\"block-size: 312px; display: block; min-width: 0px; padding-block-start: 0px; padding-top: 0px; perspective-origin: 407px 156px; transform-origin: 407px 156px; vertical-align: baseline; \"\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 252.5px 8px; transform-origin: 252.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eReturn values of selected Activation function type for value,vector, and matrices.\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; 
display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 240.5px 8px; transform-origin: 240.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003ey=Activation(x,id); where id is 1:4 for ReLU, sigmoid, hyperbolic_tan, Softmax\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 253.5px 8px; transform-origin: 253.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eReLU: Rectified Linear Unit, clips negatives  max(0,x)   Trains faster than sigmoid\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; 
margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 137px 8px; transform-origin: 137px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eSigmoid: Exponential normalization [0:1]      \u003c/span\u003e\u003c/span\u003e\u003cspan style=\"vertical-align:-5px\"\u003e\u003cimg src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJEAAAAnCAYAAAAPS6pLAAAFo0lEQVR4Xu2bSeseRRDGkw8g4vIJooKCEMFEQfQiuMZbFE0iIiiuJ8ENl0MwEc1F8OACHnIwqEguguICCi4XNxS8uRxUPLniB9DnB1NhHHupfnvemf+b6YaHP0kv01X9dHVVdb/bt7XSNFCpge2V/Vv3poFtjUSNBNUaaCSqVmEboJGocaBaA3OS6CbN/j3ht2op2gCzamAuEu2Q1N8LFwhfOzRwhtpcL7zoaNua1GmgeHPPRaI7Jeftwm6HvBDomPCQk3COIVuThAZ2qu7pEn2XkggL8tcIR9BnGuMlh2WBQG93hPNYLNMNiihp31j1Xw2gv+PCXo8evSSCPPcLdwveIyi2MEzwK+Es4YfM6kGg94UjzlW+VO0eEy4STnf2ac3CGkCXzwhX54xGjkRYgic68tinakl0WANd2E0utYAPqvIO4WzHKpsJvqpr+0cj0f+0xlpeIdwqoKd9wqsC6/GI8JqAP9Qv1OO7PppagxSJrlXH84VvhIPCrm6gWhJ9p3GwFkwwVhD4W+GeTDv641+d2s3zZf09TWgkiusWC/NRR5ovu2bvdH+HLoAFQJep/uPYkDlLZP2wCjhblBoSmQBnapxUaF9ihfqycfyxy9ZFItNDaNcm9sSWq/qnm5FHDnT6pzC0UieEmppEz3VfxsKkyu+qfF5ImtHAAEsgkRHAw8wnIzo0PXkMAuR5RYj6sFOTCHLcLLyV0ADH6JvCnky70BBLIBHW3Fv+VsNQlMpmJkjy6NiOtLvUPpinm5JEMJrJ56Imc/RyR97JTiKChVOEX4RcFOslFe0g4dHOssQs1XA8Nv+7QvBIm5JELk9fEzVr4p1bX+BNt0SWSmGxOM6vFAhovIudI5Pl3e5TQ5zrzwUSvlj/1OmAXomSg5Gyd6FqHWsm/6vgOYNh/acC+YnSsskkIsp8oVtYy81YIIIeVrHMpj/G/km4TvhQYEMTJePncKTdIkQdZ9UlN/ZUJEKIB2JMHjAFx5GQc0kkMh+FnMzFgkWuRqzaaJMbAgjzlGCJWxsbi/d475uhjWtGJOhcT0Ui7zUHAsxJIizmeQnzt191OKSQ/FCiXYkfY4vJcH1Lbbky8l5RpzYxhzGrjETBfNEUJDLv3nPNMTeJ+sdHzSJ4F910w7f6ORvzXfCHsBS5lEjNXD19ZyeR95rDhJnTEtn1SUyxOJZsBo4X/LZYeVYVKUfV+uGb3Nj9Ayv0s3Cb8LDA0YYDHM0Ue1Z/pDbJiHkKS+S55ujLSntwsvtEfSsEKSEN5XWB43IrvUKY1bH2XnP0SbSUEN+SqsjuPf5GMizFw7AmHLHB91/rtkSWXEyFj0OJkpFARvxNCvH7aRNP5rh45UfsgIsR9c3WTSLPNcdQVntvtIpiN4lE/ajMnmWEdMGFsvc91Yi8OTGUrUdsju7fna2SbMRc8zQjd80REhyfiDR7aVSySSTq+0RkjnGw+9cb9pKhnzdaB0lyYzIPHP1zhODLC48lQlgWlKiEwltnz84g8sASlRKBb2QnHpB8GJ4n38DkNBepH/spiEU9fA7HmjCfQi4Knd8bW7gV579Kt+yGTpEI8hBu8gpxWNgxRA+xX1+
UXHOEBCt5lAZ5SAIy32H5Qv/xieAJtz0KHptEfJP5X9PpmXc7Hwj8lGrMS1ePbKE2zO0NgXxVdD4eS7TKBHCkyeh6nrbGxrfjMGpGV5lYZR+Ueonwo5B6mVn5mS3TnZsGUg7Jk2ddJHJ93KEq7yM2x1CtSaEGsLqXC9l83TpIVHrNkZMNZ/noQnZ+ThdT1bt/6cGE1kEiGHyD4PlholcpWCR7wuDt09qtpgG7+jmg7q6fuK+DRKXXHF5Rcw+nvOO0dmkNFOt5bBJZYqrmAVVb5A3TwNgkwh86VxgrpN4wdS5zumOTaJlaXLjUjUQLJ8AY4jcSjaHFhY/xLzQKbDe+g/quAAAAAElFTkSuQmCC\" style=\"width: 72.5px; height: 19.5px;\" width=\"72.5\" height=\"19.5\"\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 120.5px 8px; transform-origin: 120.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eHyperTan: Normalization[-1:1]   tanh(x)\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; 
margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 192px 8px; transform-origin: 192px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eSoftmax: Normalizes output sum to 1, individual values [0:1]   \u003c/span\u003e\u003c/span\u003e\u003cspan style=\"vertical-align:-5px\"\u003e\u003cimg src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFYAAAAnCAYAAACVKL+TAAAEHElEQVRoQ+2ZN+sWQRDG//8PIJg+gaGwUjA1NgoGtFRQ0U4xFTaigtoIKga0NIA2oqi9iAEUTGBEKwtDaWVA/QD6PLAjy7Lxbufuinvh4Q23u7fzu9nZmX0nJ8aXCoFJlVHHQSdGsEpOMIIdwSoRUBp2yB67DDb/gd4r2a467JDBfoLlZ6BLqgSUBh8q2Pmw9x00E/quZLvqsEMFexxWz4Y2WdYT9pQWND50+ZCGCvYHIOyBblogCftQC7ALuozXNcHOwsRXQvuNt9HjfkHXodXQrsx4uRbtrkFzPR72Cr8tMnB/ms9fArBn4Pdt0ClzvQbYbBtrghX7DhhjCHI5dBX6DX2FQhBsNufNF3qs+yKsF+bB8dpnaKnnAdj9ZD41wGbbqAGWadITM4NcL5UJE9xHaCt0J+CJjLWPoGnm+j28rwm05c8c8xtUE2zSRg2wNOZvpje5PLhZHYPmREDxEtvdsNqcwOfDkT6M1YzPOSsmcev/l6M2aoFlDsoYW5ouEQCXdwySWOZuZptxwd7scgE1bRe1sRQsl9U8M5OngRkxpu0wYNfhPbSk3e7cGAi1ZMkS5EZroJK+IaBVbMwFy6W3z8zkvllWhLAesktOxr/L0FmIS1WWKHf6FOCdaLMdWlzgQoRwF7IzBV82kTNkVRtTYOlFt8zE7aVGY5hCEdw5iGnNc+gIdBBijU/wFMtSvlKlKVMpPpRUOxdS6Wbm9lexMQbWnrC7uzOB567M3x9AryH+thcSz+QyXQWdhE671jjfpYRlXG6ywdi7dEn4UbMxBNbOF+mxdmkpeSET9OkJYLmXfSVsbl+2E7Al6Z2qjSGwTNJ3G8vsDUHSHEJl8l/rSI87LMNIk11dvO4C+udkE/LAVG30gZVlyQmIt9IjaDjjKmPoFajWqVOshE15rRQUL9EwViS446jb6ANrP0nGTsZSZgKPIcbTWkBtz+FnXwkbAysZAeezJWNe3KT4YgxXt9EHVg46asbQEKCcEjbUl2GDaZab8oXaE+ZtiJuruo0+sCzV+KK3luSUMe8KXWPMpsGlmyD7sG9unJcHKO3VbfSBlVSKYBm3fEufuzjjbJPUyIZcUsJKPxYSF6HctIox/CjEVE6KB3UbUzHWTV+ksmqSyLte2+TUyT18KVkldmizY6yKjT6wDPKyaXHizAzeQiugJVDsSK/EUHoeD8VTJ1kyplthldyLbe3jRXUbYwXCBhPDpuL9DfQMStX7JcaWlrAMPwtLbuC0fYjvdgXIFaNmY+qsoIUd0a5yktW0hNWaV7Vx+wIr3leS1FczuouB+gLbpoTtgkvre/QBVg5MSv9daG1slwP0ATb2L2yXtqveqw+wTM5rpWyqcNoM3gdY5q+l/xK0sbGXvn2A7cXQrm86glUiPoJVAvsPgJgaNxW1h50AAAAASUVORK5CYII=\" style=\"width: 43px; height: 19.5px;\" width=\"43\" height=\"19.5\"\u003e\u003c/span\u003e\u003cspan style=\"block-size: auto; display: 
inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 73.5px 8px; transform-origin: 73.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003e   Used on Output node\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 0px 8px; transform-origin: 0px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003e\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 42px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 21px; text-align: left; transform-origin: 384px 21px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 352.5px 8px; transform-origin: 352.5px 8px; unicode-bidi: normal; 
\"\u003e\u003cspan style=\"\"\u003eWorking though a series of Neural Net challenges from Perceptron, Hidden Layers, Back Propogation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from Mnist. \u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 265px 8px; transform-origin: 265px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eMight take a day or two  to completely cover Neural Nets in a Matlab centric fashion. 
\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 130px 8px; transform-origin: 130px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eEssentially Out=Softmax(ReLU(X*W)*WP)\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003c/div\u003e\u003c/div\u003e","function_template":"function y = Activation(x,id)\r\n%id/function 1/ReLU 2/Sigmoid 3/Hyperbolictan 4/Softmax\r\n%x may be a point, vector or matrix\r\n  y = x;\r\nend","test_suite":"%%\r\nvalid=1;\r\nx = 2;\r\ny = Activation(x,1);\r\nif y~=max(0,x),valid=0;end\r\ny = Activation(x,2);\r\nif y~=1./(1+exp(-x)),valid=0;end\r\ny = Activation(x,3);\r\nif y~=tanh(x),valid=0;end\r\ny = Activation(x,4);\r\nif y~=1,valid=0;end\r\nassert(valid)\r\n%%\r\nvalid=1;\r\nx = [-1 0 1 2];\r\ny = Activation(x,1);\r\nif y~=max(0,x),valid=0;end\r\ny = Activation(x,2);\r\nif y~=1./(1+exp(-x)),valid=0;end\r\ny = Activation(x,3);\r\nif y~=tanh(x),valid=0;end\r\ny = Activation(x,4);\r\nif y~=exp(x)./sum(exp(x)),valid=0;end\r\nassert(valid)\r\n%%\r\nvalid=1;\r\nx = [-1 0 1 2;.5 .25 -2 5];\r\ny = Activation(x,1);\r\nif y~=max(0,x),valid=0;end\r\ny = Activation(x,2);\r\nif y~=1./(1+exp(-x)),valid=0;end\r\ny = Activation(x,3);\r\nif y~=tanh(x),valid=0;end\r\ny = Activation(x,4);\r\nif 
y~=exp(x)./sum(sum(exp(x))),valid=0;end\r\nassert(valid)\r\n","published":true,"deleted":false,"likes_count":1,"comments_count":1,"created_by":3097,"edited_by":3097,"edited_at":"2023-08-19T14:59:54.000Z","deleted_by":null,"deleted_at":null,"solvers_count":15,"test_suite_updated_at":null,"rescore_all_solutions":false,"group_id":1,"created_at":"2023-08-19T14:02:05.000Z","updated_at":"2026-03-04T20:22:48.000Z","published_at":"2023-08-19T14:59:56.000Z","restored_at":null,"restored_by":null,"spam":null,"simulink":false,"admin_reviewed":false,"description_opc":"{\"parts\":[{\"partUri\":\"/matlab/document.xml\",\"contentType\":\"application/vnd.mathworks.matlab.code.document+xml\",\"content\":\"\u003c?xml version=\\\"1.0\\\" encoding=\\\"UTF-8\\\"?\u003e\u003cw:document xmlns:w=\\\"http://schemas.openxmlformats.org/wordprocessingml/2006/main\\\"\u003e\u003cw:body\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eReturn values of selected Activation function type for value,vector, and matrices.\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003ey=Activation(x,id); where id is 1:4 for ReLU, sigmoid, hyperbolic_tan, Softmax\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eReLU: Rectified Linear Unit, clips negatives  max(0,x)   Trains faster than sigmoid\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eSigmoid: Exponential normalization [0:1]      \u003c/w:t\u003e\u003c/w:r\u003e\u003cw:customXml 
w:element=\\\"equation\\\"\u003e\u003cw:customXmlPr\u003e\u003cw:attr w:name=\\\"displayStyle\\\" w:val=\\\"false\\\"/\u003e\u003c/w:customXmlPr\u003e\u003cw:r\u003e\u003cw:t\u003e1/(1+e^{-x})\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:customXml\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eHyperTan: Normalization[-1:1]   tanh(x)\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eSoftmax: Normalizes output sum to 1, individual values [0:1]   \u003c/w:t\u003e\u003c/w:r\u003e\u003cw:customXml w:element=\\\"equation\\\"\u003e\u003cw:customXmlPr\u003e\u003cw:attr w:name=\\\"displayStyle\\\" w:val=\\\"false\\\"/\u003e\u003c/w:customXmlPr\u003e\u003cw:r\u003e\u003cw:t\u003ee^x/\\\\Sigma e^x\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:customXml\u003e\u003cw:r\u003e\u003cw:t\u003e   Used on Output node\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003e\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eWorking though a series of Neural Net challenges from Perceptron, Hidden Layers, Back Propogation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from Mnist. \u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eMight take a day or two  to completely cover Neural Nets in a Matlab centric fashion. 
\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003cw:p\u003e\u003cw:pPr\u003e\u003cw:pStyle w:val=\\\"text\\\"/\u003e\u003cw:jc w:val=\\\"left\\\"/\u003e\u003c/w:pPr\u003e\u003cw:r\u003e\u003cw:t\u003eEssentially Out=Softmax(ReLU(X*W)*WP)\u003c/w:t\u003e\u003c/w:r\u003e\u003c/w:p\u003e\u003c/w:body\u003e\u003c/w:document\u003e\",\"relationship\":null}],\"relationships\":[{\"relationshipType\":\"http://schemas.mathworks.com/matlab/code/2013/relationships/document\",\"target\":\"/matlab/document.xml\",\"relationshipId\":\"rId1\"}]}"}],"problem_search":{"errors":[],"problems":[{"id":58882,"title":"Neural Nets: Activation functions","description":"Return values of selected Activation function type for value,vector, and matrices.\r\ny=Activation(x,id); where id is 1:4 for ReLU, sigmoid, hyperbolic_tan, Softmax\r\nReLU: Rectified Linear Unit, clips negatives  max(0,x)   Trains faster than sigmoid\r\nSigmoid: Exponential normalization [0:1]      \r\nHyperTan: Normalization[-1:1]   tanh(x)\r\nSoftmax: Normalizes output sum to 1, individual values [0:1]      Used on Output node\r\n\r\nWorking though a series of Neural Net challenges from Perceptron, Hidden Layers, Back Propogation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from Mnist. \r\nMight take a day or two  to completely cover Neural Nets in a Matlab centric fashion. 
\r\nEssentially Out=Softmax(ReLU(X*W)*WP)","description_html":"\u003cdiv style = \"text-align: start; line-height: 20.4333px; min-height: 0px; white-space: normal; color: rgb(0, 0, 0); font-family: Menlo, Monaco, Consolas, monospace; font-style: normal; font-size: 14px; font-weight: 400; text-decoration: rgb(0, 0, 0); white-space: normal; \"\u003e\u003cdiv style=\"block-size: 312px; display: block; min-width: 0px; padding-block-start: 0px; padding-top: 0px; perspective-origin: 407px 156px; transform-origin: 407px 156px; vertical-align: baseline; \"\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 252.5px 8px; transform-origin: 252.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eReturn values of selected Activation function type for value,vector, and matrices.\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; 
display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 240.5px 8px; transform-origin: 240.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003ey=Activation(x,id); where id is 1:4 for ReLU, sigmoid, hyperbolic_tan, Softmax\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 253.5px 8px; transform-origin: 253.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eReLU: Rectified Linear Unit, clips negatives  max(0,x)   Trains faster than sigmoid\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; 
margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 137px 8px; transform-origin: 137px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eSigmoid: Exponential normalization [0:1]      \u003c/span\u003e\u003c/span\u003e\u003cspan style=\"vertical-align:-5px\"\u003e\u003cimg src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAJEAAAAnCAYAAAAPS6pLAAAFo0lEQVR4Xu2bSeseRRDGkw8g4vIJooKCEMFEQfQiuMZbFE0iIiiuJ8ENl0MwEc1F8OACHnIwqEguguICCi4XNxS8uRxUPLniB9DnB1NhHHupfnvemf+b6YaHP0kv01X9dHVVdb/bt7XSNFCpge2V/Vv3poFtjUSNBNUaaCSqVmEboJGocaBaA3OS6CbN/j3ht2op2gCzamAuEu2Q1N8LFwhfOzRwhtpcL7zoaNua1GmgeHPPRaI7Jeftwm6HvBDomPCQk3COIVuThAZ2qu7pEn2XkggL8tcIR9BnGuMlh2WBQG93hPNYLNMNiihp31j1Xw2gv+PCXo8evSSCPPcLdwveIyi2MEzwK+Es4YfM6kGg94UjzlW+VO0eEy4STnf2ac3CGkCXzwhX54xGjkRYgic68tinakl0WANd2E0utYAPqvIO4WzHKpsJvqpr+0cj0f+0xlpeIdwqoKd9wqsC6/GI8JqAP9Qv1OO7PppagxSJrlXH84VvhIPCrm6gWhJ9p3GwFkwwVhD4W+GeTDv641+d2s3zZf09TWgkiusWC/NRR5ovu2bvdH+HLoAFQJep/uPYkDlLZP2wCjhblBoSmQBnapxUaF9ihfqycfyxy9ZFItNDaNcm9sSWq/qnm5FHDnT6pzC0UieEmppEz3VfxsKkyu+qfF5ImtHAAEsgkRHAw8wnIzo0PXkMAuR5RYj6sFOTCHLcLLyV0ADH6JvCnky70BBLIBHW3Fv+VsNQlMpmJkjy6NiOtLvUPpinm5JEMJrJ56Imc/RyR97JTiKChVOEX4RcFOslFe0g4dHOssQs1XA8Nv+7QvBIm5JELk9fEzVr4p1bX+BNt0SWSmGxOM6vFAhovIudI5Pl3e5TQ5zrzwUSvlj/1OmAXomSg5Gyd6FqHWsm/6vgOYNh/acC+YnSsskkIsp8oVtYy81YIIIeVrHMpj/G/km4TvhQYEMTJePncKTdIkQdZ9UlN/ZUJEKIB2JMHjAFx5GQc0kkMh+FnMzFgkWuRqzaaJMbAgjzlGCJWxsbi/d475uhjWtGJOhcT0Ui7zUHAsxJIizmeQnzt191OKSQ/FCiXYkfY4vJcH1Lbbky8l5RpzYxhzGrjETBfNEUJDLv3nPNMTeJ+sdHzSJ4F910w7f6ORvzXfCHsBS5lEjNXD19ZyeR95rDhJnTEtn1SUyxOJZsBo4X/LZYeVYVKUfV+uGb3Nj9Ayv0s3Cb8LDA0YYDHM0Ue1Z/pDbJiHkKS+S55ujLSntwsvtEfSsEKSEN5XWB43IrvUKY1bH2XnP0SbSUEN+SqsjuPf5GMizFw7AmHLHB91/rtkSWXEyFj0OJkpFARvxNCvH7aRNP5rh45UfsgIsR9c3WTSLPNcdQVntvtIpiN4lE/ajMnmWEdMGFsvc91Yi8OTGUrUdsju7fna2SbMRc8zQjd80REhyfiDR7aVSySSTq+0RkjnGw+9cb9pKhnzdaB0lyYzIPHP1zhODLC48lQlgWlKiEwltnz84g8sASlRKBb2QnHpB8GJ4n38DkNBepH/spiEU9fA7HmjCfQi4Knd8bW7gV579Kt+yGTpEI8hBu8gpxWNgxRA+xX1+
UXHOEBCt5lAZ5SAIy32H5Qv/xieAJtz0KHptEfJP5X9PpmXc7Hwj8lGrMS1ePbKE2zO0NgXxVdD4eS7TKBHCkyeh6nrbGxrfjMGpGV5lYZR+Ueonwo5B6mVn5mS3TnZsGUg7Jk2ddJHJ93KEq7yM2x1CtSaEGsLqXC9l83TpIVHrNkZMNZ/noQnZ+ThdT1bt/6cGE1kEiGHyD4PlholcpWCR7wuDt09qtpgG7+jmg7q6fuK+DRKXXHF5Rcw+nvOO0dmkNFOt5bBJZYqrmAVVb5A3TwNgkwh86VxgrpN4wdS5zumOTaJlaXLjUjUQLJ8AY4jcSjaHFhY/xLzQKbDe+g/quAAAAAElFTkSuQmCC\" style=\"width: 72.5px; height: 19.5px;\" width=\"72.5\" height=\"19.5\"\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 120.5px 8px; transform-origin: 120.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eHyperTan: Normalization[-1:1]   tanh(x)\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; 
margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 192px 8px; transform-origin: 192px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003eSoftmax: Normalizes output sum to 1, individual values [0:1]   \u003c/span\u003e\u003c/span\u003e\u003cspan style=\"vertical-align:-5px\"\u003e\u003cimg src=\"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFYAAAAnCAYAAACVKL+TAAAEHElEQVRoQ+2ZN+sWQRDG//8PIJg+gaGwUjA1NgoGtFRQ0U4xFTaigtoIKga0NIA2oqi9iAEUTGBEKwtDaWVA/QD6PLAjy7Lxbufuinvh4Q23u7fzu9nZmX0nJ8aXCoFJlVHHQSdGsEpOMIIdwSoRUBp2yB67DDb/gd4r2a467JDBfoLlZ6BLqgSUBh8q2Pmw9x00E/quZLvqsEMFexxWz4Y2WdYT9pQWND50+ZCGCvYHIOyBblogCftQC7ALuozXNcHOwsRXQvuNt9HjfkHXodXQrsx4uRbtrkFzPR72Cr8tMnB/ms9fArBn4Pdt0ClzvQbYbBtrghX7DhhjCHI5dBX6DX2FQhBsNufNF3qs+yKsF+bB8dpnaKnnAdj9ZD41wGbbqAGWadITM4NcL5UJE9xHaCt0J+CJjLWPoGnm+j28rwm05c8c8xtUE2zSRg2wNOZvpje5PLhZHYPmREDxEtvdsNqcwOfDkT6M1YzPOSsmcev/l6M2aoFlDsoYW5ouEQCXdwySWOZuZptxwd7scgE1bRe1sRQsl9U8M5OngRkxpu0wYNfhPbSk3e7cGAi1ZMkS5EZroJK+IaBVbMwFy6W3z8zkvllWhLAesktOxr/L0FmIS1WWKHf6FOCdaLMdWlzgQoRwF7IzBV82kTNkVRtTYOlFt8zE7aVGY5hCEdw5iGnNc+gIdBBijU/wFMtSvlKlKVMpPpRUOxdS6Wbm9lexMQbWnrC7uzOB567M3x9AryH+thcSz+QyXQWdhE671jjfpYRlXG6ywdi7dEn4UbMxBNbOF+mxdmkpeSET9OkJYLmXfSVsbl+2E7Al6Z2qjSGwTNJ3G8vsDUHSHEJl8l/rSI87LMNIk11dvO4C+udkE/LAVG30gZVlyQmIt9IjaDjjKmPoFajWqVOshE15rRQUL9EwViS446jb6ANrP0nGTsZSZgKPIcbTWkBtz+FnXwkbAysZAeezJWNe3KT4YgxXt9EHVg46asbQEKCcEjbUl2GDaZab8oXaE+ZtiJuruo0+sCzV+KK3luSUMe8KXWPMpsGlmyD7sG9unJcHKO3VbfSBlVSKYBm3fEufuzjjbJPUyIZcUsJKPxYSF6HctIox/CjEVE6KB3UbUzHWTV+ksmqSyLte2+TUyT18KVkldmizY6yKjT6wDPKyaXHizAzeQiugJVDsSK/EUHoeD8VTJ1kyplthldyLbe3jRXUbYwXCBhPDpuL9DfQMStX7JcaWlrAMPwtLbuC0fYjvdgXIFaNmY+qsoIUd0a5yktW0hNWaV7Vx+wIr3leS1FczuouB+gLbpoTtgkvre/QBVg5MSv9daG1slwP0ATb2L2yXtqveqw+wTM5rpWyqcNoM3gdY5q+l/xK0sbGXvn2A7cXQrm86glUiPoJVAvsPgJgaNxW1h50AAAAASUVORK5CYII=\" style=\"width: 43px; height: 19.5px;\" width=\"43\" height=\"19.5\"\u003e\u003c/span\u003e\u003cspan style=\"block-size: auto; display: 
inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 73.5px 8px; transform-origin: 73.5px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003e   Used on Output node\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 21px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 10.5px; text-align: left; transform-origin: 384px 10.5px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 0px 8px; transform-origin: 0px 8px; unicode-bidi: normal; \"\u003e\u003cspan style=\"\"\u003e\u003c/span\u003e\u003c/span\u003e\u003c/div\u003e\u003cdiv style=\"block-size: 42px; font-family: Helvetica, Arial, sans-serif; line-height: 21px; margin-block-end: 9px; margin-block-start: 2px; margin-bottom: 9px; margin-inline-end: 10px; margin-inline-start: 4px; margin-left: 4px; margin-right: 10px; margin-top: 2px; perspective-origin: 384px 21px; text-align: left; transform-origin: 384px 21px; white-space: pre-wrap; margin-left: 4px; margin-top: 2px; margin-bottom: 9px; margin-right: 10px; \"\u003e\u003cspan style=\"block-size: auto; display: inline; margin-block-end: 0px; margin-block-start: 0px; margin-bottom: 0px; margin-inline-end: 0px; margin-inline-start: 0px; margin-left: 0px; margin-right: 0px; margin-top: 0px; perspective-origin: 352.5px 8px; transform-origin: 352.5px 8px; unicode-bidi: normal; 
\"\u003eWorking through a series of Neural Net challenges from Perceptron, Hidden Layers, Back Propagation, ..., to the Convolutional Neural Net/Training for Handwritten Digits from MNIST.\u003c/span\u003e\u003c/div\u003e\u003cdiv\u003eMight take a day or two to completely cover Neural Nets in a MATLAB-centric fashion.\u003c/div\u003e\u003cdiv\u003eEssentially Out=Softmax(ReLU(X*W)*WP)\u003c/div\u003e\u003c/div\u003e\u003c/div\u003e","function_template":"function y = Activation(x,id)\r\n%id/function 1/ReLU 2/Sigmoid 3/Hyperbolic tan 4/Softmax\r\n%x may be a scalar, vector, or matrix\r\n  y = x;\r\nend","test_suite":"%%\r\nvalid=1;\r\nx = 2;\r\ny = Activation(x,1);\r\nif ~isequal(y,max(0,x)),valid=0;end\r\ny = Activation(x,2);\r\nif ~isequal(y,1./(1+exp(-x))),valid=0;end\r\ny = Activation(x,3);\r\nif ~isequal(y,tanh(x)),valid=0;end\r\ny = Activation(x,4);\r\nif ~isequal(y,1),valid=0;end\r\nassert(valid)\r\n%%\r\nvalid=1;\r\nx = [-1 0 1 2];\r\ny = Activation(x,1);\r\nif ~isequal(y,max(0,x)),valid=0;end\r\ny = Activation(x,2);\r\nif ~isequal(y,1./(1+exp(-x))),valid=0;end\r\ny = Activation(x,3);\r\nif ~isequal(y,tanh(x)),valid=0;end\r\ny = Activation(x,4);\r\nif ~isequal(y,exp(x)./sum(exp(x))),valid=0;end\r\nassert(valid)\r\n%%\r\nvalid=1;\r\nx = [-1 0 1 2;.5 .25 -2 5];\r\ny = Activation(x,1);\r\nif ~isequal(y,max(0,x)),valid=0;end\r\ny = Activation(x,2);\r\nif ~isequal(y,1./(1+exp(-x))),valid=0;end\r\ny = Activation(x,3);\r\nif ~isequal(y,tanh(x)),valid=0;end\r\ny = Activation(x,4);\r\nif ~isequal(y,exp(x)./sum(sum(exp(x)))),valid=0;end\r\nassert(valid)\r\n","published":true,"deleted":false,"likes_count":1,"comments_count":1,"created_by":3097,"edited_by":3097,"edited_at":"2023-08-19T14:59:54.000Z","deleted_by":null,"deleted_at":null,"solvers_count":15,"test_suite_updated_at":null,"rescore_all_solutions":false,"group_id":1,"created_at":"2023-08-19T14:02:05.000Z","updated_at":"2026-03-04T20:22:48.000Z","published_at":"2023-08-19T14:59:56.000Z","restored_at":null,"restored_by":null,"spam":null,"simulink":false,"admin_reviewed":false,"description_opc":null}],"term":"tag:\"activation functions\"","current_player_id":null,"results":[{"id":58882,"difficulty_rating":"easy"}]}