INCUBATION | AI Explainability 360 is an open-source toolkit that helps users better understand how machine learning models predict labels, using a wide variety of techniques throughout the AI application lifecycle. GitHub: To be updated (https://github.com/ai-explainability) |
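To illustrate the kind of post-hoc explanation technique such a toolkit provides, here is a minimal, self-contained sketch of a local linear surrogate explanation (the general idea behind perturbation-based explainers). This is a hypothetical illustration, not AI Explainability 360's actual API; the names `black_box` and `local_linear_explanation` are invented for this example.

```python
import numpy as np

# Hypothetical black-box model: we can only query its predictions.
def black_box(X):
    return 3.0 * X[:, 0] - 2.0 * X[:, 1] + np.sin(X[:, 2])

def local_linear_explanation(model, x, n_samples=500, scale=0.1, seed=0):
    """Fit a linear surrogate around point x to estimate local feature effects."""
    rng = np.random.default_rng(seed)
    # Perturb the instance in a small neighborhood around x.
    X = x + rng.normal(0.0, scale, size=(n_samples, x.size))
    y = model(X)
    # Least-squares fit with an intercept column appended.
    A = np.hstack([X, np.ones((n_samples, 1))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[:-1]  # per-feature local weights (intercept dropped)

x0 = np.array([1.0, 1.0, 0.0])
weights = local_linear_explanation(black_box, x0)
print(np.round(weights, 2))  # approx [3., -2., 1.] near x0
```

The fitted weights recover the model's local sensitivities: the two linear terms exactly, and the derivative of the sine term (cos(0) = 1) approximately, which is the core intuition behind local surrogate explainers.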
AI Explainability 360 Charter
To be updated
Reference Information
- Website: To be updated
- Github: To be updated
- Mail Lists:
  - https://lists.lfai.foundation/g/trusted-ai-explainability-360-announce
  - https://lists.lfai.foundation/g/trusted-ai-explainability-360-technical-discuss
  - https://lists.lfai.foundation/g/trusted-ai-explainability-360-tsc
- Wiki
- Artwork
- Project Lead: Animesh Singh, singhan@us.ibm.com
- LF AI Technical Advisory Council (TAC) Sponsor: Ibrahim Haddad, LF AI Executive Director (temporary)