
Harness the power of AI through standards support

A community for UK businesses in the Innovate UK BridgeAI programme to connect, and collaborate with BSI’s standards expertise, unlocking the full potential of AI in a responsible, ethical and trustworthy way.

  • Where to start with AI Assurance: A Q&A with Saket Mohan



    By @Tahira, AI project manager, BSI 

    Having rigorous assurance in place for your AI deployments demonstrates to your stakeholders that your business can manage risks, operate safely, and realise the potential AI can bring. Standards can guide you in implementing this secure foundation, but for many SMEs it can be hard to know where to start or which standards to use.

    Our upcoming webinar on AI Assurance aims to help with this and answer the questions SMEs may have. One of the panellists joining the event is Saket Mohan, an Innovate UK BridgeAI grant winner and founder of Secure Elements.

    Secure Elements is an SME which focuses on cybersecurity engineering in AI and provides automotive cybersecurity and safety analysis.

    In this blog we speak to Saket to get a better understanding of the AI assurance and standards journey, including the key questions an SME should ask when seeking an AI assurance provider. 

    What should SMEs and start-ups focus on when starting out on their AI journeys, particularly in the context of cybersecurity?

    The first thing to do is to establish a set of guidelines and process documents which include considerations for standards, local legislation, and regulations. You can usually find these easily in the public domain.

    Understanding and applying these guidelines is the best thing an organisation can do to show market compliance and become operational quickly. The next step is identifying or collecting high-quality data. You will then need to apply the established standards and regulations to this data. From here you can select the appropriate type of machine learning and guidance to train your model.

    Once your model generates the desired outputs, it should be rigorously stress-tested. Regarding cybersecurity, it’s vital to ensure the legitimacy and appropriateness of the data for training purposes and address any sensitive or critical information that could lead to reputational damage or legal breaches.
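    The data-screening step described above can be sketched as a simple scan of training records for sensitive patterns. This is an illustrative sketch only, not Secure Elements' actual process; the patterns and sample records are hypothetical, and a production screen would cover many more categories (names, addresses, credentials, and so on).

    ```python
    import re

    # Hypothetical patterns for illustration; a real screen would be far broader.
    PATTERNS = {
        "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
        "uk_phone": re.compile(r"\b(?:\+44\s?7\d{3}|07\d{3})\s?\d{6}\b"),
    }

    def find_sensitive(records):
        """Return (record_index, category, match) for every hit in a list of text records."""
        hits = []
        for i, text in enumerate(records):
            for category, pattern in PATTERNS.items():
                for match in pattern.findall(text):
                    hits.append((i, category, match))
        return hits

    sample = [
        "Sensor log 2024-03-01: brake latency 42 ms",
        "Contact the test driver at jane.doe@example.com",
    ]
    print(find_sensitive(sample))  # the second record contains an email address
    ```

    Flagged records can then be redacted or excluded before training, addressing the reputational and legal risks the answer mentions.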

    What tools and resources are you using that have helped you?

    For testing and monitoring performance, we've used the Department for Science, Innovation and Technology (DSIT) open-source AI testing framework. It provides a performance score, so we can evaluate our model's effectiveness.
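    A performance score of the kind described is, at its simplest, a measure of how well predictions match labelled test data. The sketch below is not the DSIT framework's API, which is not shown in this interview; it is only a generic illustration of such a score.

    ```python
    # Generic sketch: score a model's predictions against ground-truth labels.
    # This is an illustration, not the DSIT framework's actual interface.

    def performance_score(predictions, labels):
        """Fraction of predictions that match the ground-truth labels."""
        if len(predictions) != len(labels):
            raise ValueError("predictions and labels must be the same length")
        correct = sum(p == y for p, y in zip(predictions, labels))
        return correct / len(labels)

    print(performance_score([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
    ```

    Real frameworks report richer metrics (precision, recall, robustness under perturbation), but they all reduce to comparing model outputs against trusted reference data in this way.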

    We use a GDPR assessment to ensure the quality of our data. Good testing of any system considers various frameworks, such as the AI Risk Management Framework from NIST, especially when dealing with client data.

    We also use OWASP's CycloneDX standard to produce a software bill of materials (SBOM), ensuring secure software exchange within the supply chain.
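    For readers unfamiliar with SBOMs, a CycloneDX document is a structured inventory of the components inside a piece of software. The sketch below builds a minimal CycloneDX-style JSON document; the component name and version are hypothetical, and a real SBOM would typically be generated by tooling rather than written by hand.

    ```python
    import json

    # Minimal CycloneDX-style SBOM (JSON format). The listed component is
    # a hypothetical dependency, purely for illustration.
    sbom = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "version": 1,
        "components": [
            {
                "type": "library",
                "name": "example-crypto-lib",  # hypothetical dependency
                "version": "2.4.1",
            },
        ],
    }

    print(json.dumps(sbom, indent=2))
    ```

    Sharing a document like this lets customers down the supply chain check exactly which components, and which versions, they are taking on.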

    What standards does your organisation use or plan to use for developing AI solutions?

    Secure Elements is a cybersecurity company, so we put standards and best practices at the forefront of what we do. All our tools and software are developed using standards and codes.

    A few examples of this are:

    We also adhere to UN regulations R155 – Cyber security and cyber security management system, and R156 – Software update and software update management system. As we start incorporating AI algorithms into our software models, we are applying ISO/IEC 42001 too.

    How has adopting these standards benefitted you?

    Because these standards are widely adopted, across 58 countries, following them means our products will always align with industry mandates. We see compliance as important for selling into the supply chain, as non-compliance could bar us from market entry.

    A key reason we adopt these standards and regulations, even if not locally mandated, is because they represent a common approach. Regulations are often based on standards and adhering to standards prepares us for future regulation and safeguards our business model. 

    This strategy means we’re always up to date and can have business continuity despite the introduction of new regulations and controls.

    Are you considering certification or assurance in cybersecurity, and why?

    Yes, we are pursuing ISO 27001 certification and aiming for a cybersecurity certificate of compliance from the National Cyber Security Centre (NCSC). We’re relying on these certifications to protect our business and show our clients that we are a trusted supplier. 

    Ultimately, our goal is to enter the market responsibly and with good cybersecurity practices. Because of that, we use standards as often as possible, and expect to adopt more as our responsibilities grow.

    From a data and cybersecurity perspective, what do you look for when seeking out new companies to do business with, especially regarding the development and deployment of AI?

    We prioritise the quality of data and training methods. Businesses should be asking questions about the model or method through which the AI has been trained. At Secure Elements, we ask suppliers for a verification and validation (V&V) matrix to ensure the security of the models we procure or sell.
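    A verification and validation matrix of the kind mentioned maps each requirement to how it is verified (checked against the specification) and validated (checked against the real-world need), along with its current status. The sketch below is purely illustrative; the requirements and evidence listed are hypothetical, not Secure Elements' actual matrix.

    ```python
    # Illustrative V&V matrix: each requirement maps to its verification
    # method, validation evidence, and status. All entries are hypothetical.
    vv_matrix = [
        {
            "requirement": "Model rejects malformed input frames",
            "verification": "unit tests on the parser",
            "validation": "fuzz testing on bench hardware",
            "status": "pass",
        },
        {
            "requirement": "False-positive rate below 1%",
            "verification": "evaluation on a held-out test set",
            "validation": "pilot deployment review",
            "status": "open",
        },
    ]

    # A buyer reviewing the matrix can immediately list unresolved items.
    open_items = [row["requirement"] for row in vv_matrix if row["status"] != "pass"]
    print(open_items)
    ```

    Asking a supplier for such a matrix makes the gaps in their assurance evidence visible before a purchase decision.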

    What kind of questions would you ask a prospective vendor or AI solution provider from a data and cybersecurity perspective?

    Our perfect list of questions looks something like this: 

    1. Can you demonstrate your processes and governance models for handling cybersecurity and AI data?
    2. What applicable standards, regulations, and best practices do you consider when developing?
    3. Do you incorporate feedback from customers and clients to improve your models?
    4. Do you have a team dedicated to assessing model efficacy and safety?
    5. Can you explain your model and provide model explainability?

    Do you have questions you’d like to ask about AI assurance and cybersecurity?  Join us at our upcoming AI Assurance & Cybersecurity webinar on Thursday, 25 July, where Saket and others on our expert panel will share further insights and answer your questions. 

    Sign up for the webinar



  • Sign up for the BridgeAI Standards Community

    The Innovate UK BridgeAI programme empowers SMEs in the high-potential sectors of Agriculture/Agrifood, Construction, Creative, and Transportation to bridge the gap to successful AI adoption, unlocking the potential for greater growth and productivity.

    The BridgeAI Standards Community supports SMEs in the programme to collaborate and learn together how to harness the power of AI in a safe and trustworthy way through the use of standards. Membership in the community is free.
