
August 02, 2020

Comments

The problem with convolutional neural networks, and indeed with algorithms like ID3 that can generate a classifier (a decision tree) from a dataset by data mining, is that they cannot generate explanations of how they work, which makes it difficult for us humans to understand them and thus assess their reliability. I had one of my AI team members, Regina Reppenhagen, implement ID3 to generate decision trees as classifiers, and another student generate natural-language explanations from those trees.
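For readers unfamiliar with ID3: it builds a decision tree top-down by repeatedly splitting the data on whichever attribute yields the largest information gain (reduction in entropy). A minimal sketch in Python follows; the toy dataset and the nested-dict tree representation are illustrative assumptions, not the implementation mentioned above.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Entropy reduction from splitting the data on attribute index `attr`."""
    base, total, remainder = entropy(labels), len(rows), 0.0
    for value in set(row[attr] for row in rows):
        subset = [lab for row, lab in zip(rows, labels) if row[attr] == value]
        remainder += len(subset) / total * entropy(subset)
    return base - remainder

def id3(rows, labels, attrs):
    """Return a tree as nested dicts: {attr_index: {value: subtree-or-label}}."""
    if len(set(labels)) == 1:
        return labels[0]                             # pure node: emit the class
    if not attrs:
        return Counter(labels).most_common(1)[0][0]  # no attrs left: majority class
    best = max(attrs, key=lambda a: information_gain(rows, labels, a))
    tree = {best: {}}
    for value in set(row[best] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[best] == value]
        tree[best][value] = id3([rows[i] for i in idx],
                                [labels[i] for i in idx],
                                [a for a in attrs if a != best])
    return tree

# Toy data (hypothetical): attributes are (outlook, windy), label is "play?"
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
labels = ["yes", "no", "yes", "no"]
tree = id3(rows, labels, [0, 1])   # splits on attribute 1 (windy), which is decisive here
```

The resulting nested dict is exactly the kind of structure a second program can walk to emit natural-language explanations ("if windy is no, then play").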

However, ID3 is susceptible to noise in the training datasets and doesn't know where to set a cutoff point amongst the noise, so it can't assess its own reliability :-(

Stu--Don't make me go down in there to convolve matrices. The last time I played with convolution was in the Rapid Runway Repair program at Tyndall AFB FL in 1981-1983. I don't even understand the language of convolutional neural networks. *sigh* That said, at this point in time I just have to use a "black box" functional relationship - or does that even make sense in this application? You brilliant programming geeks are too high up for me to even see you.
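The matrix convolution mentioned above is also the core operation inside a convolutional neural network layer. As a minimal sketch (plain Python, "valid" padding, kernel flipped as in true convolution; the example image and kernel are illustrative assumptions):

```python
def convolve2d(image, kernel):
    """'Valid' 2D convolution: slide the flipped kernel over the image
    and sum the element-wise products at each position."""
    kh, kw = len(kernel), len(kernel[0])
    # Flip the kernel in both axes (convolution rather than cross-correlation).
    flipped = [row[::-1] for row in kernel[::-1]]
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            row.append(sum(flipped[m][n] * image[i + m][j + n]
                           for m in range(kh) for n in range(kw)))
        out.append(row)
    return out

# A Laplacian-style kernel over a flat 4x4 image: the kernel's weights sum
# to zero, so a featureless region produces an all-zero response.
image = [[1, 1, 1, 1],
         [1, 1, 1, 1],
         [1, 1, 1, 1],
         [1, 1, 1, 1]]
kernel = [[0,  1, 0],
          [1, -4, 1],
          [0,  1, 0]]
result = convolve2d(image, kernel)   # -> [[0, 0], [0, 0]]
```

A CNN just learns the kernel weights from data instead of having an engineer choose them, then stacks many such filters in layers; that is also why its reasoning is so hard to read back out.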

I need that AI for bird identification of look-alike birds!

I would share, had I the AI.
