How to Add AI to My Client Portal

CustomGPT.ai’s End User IdP Login feature allows you to choose which users in your existing login system can view and chat with AI chatbots you’ve created with CustomGPT. Users authenticate through your Identity Provider (IdP) and get routed to only the agents their role permits, without you managing separate CustomGPT accounts.

TL;DR

Add client portal AI by gating chat with your enterprise IdP, then routing entitlements through roles. End User IdP Login maps an IdP attribute to chat-only roles so each external user sees only the agent(s) they’re allowed to use, via one Portal Login URL or an IdP-gated embed.
  • Default: one Portal Login URL for all users
  • Entitlements: IdP attribute value must match role name
  • Measurement: anonymous analytics by design
  • Security: SAML-based SSO, chat-only access

The Portal AI Goal

You want portal AI that feels native to your client or partner experience, without building a new portal app layer. The fastest path is to treat AI as a secured “chat surface” that inherits the identity and entitlements you already manage. You already built the chatbot; now you have to deploy it safely at scale, which means controlling access and reducing permission mistakes. IBM reports the global average cost of a data breach reached $4.4 million in 2025.

With End User IdP Login, users authenticate through your IdP and no CustomGPT end-user account is created, so you avoid “public” or “anonymous link” confusion while still keeping end-users chat-only. Learn more about deploying agents without managing users.

This article stays focused on adding AI to an existing client portal experience. It does not re-teach full Single Sign-On (SSO) rollout steps; if you need that first, use the SSO setup guide as your prerequisite path.

Login is Not Routing

In B2B portals, “enterprise SSO” usually means your clients or partners sign in using an Identity Provider (IdP) instead of a new password. That SSO step proves who they are, but it does not automatically decide what they can access. People sometimes say “partner SSO” to mean federated login for external partners, not just internal employees. In standards terms, federation lets an identity system provide authentication attributes, and optionally subscriber attributes, to a relying party. An “SSO portal” is simply a portal that fronts access with SSO.

The crucial second half is authorization: who is entitled to which features, projects, or, in this case, which AI agents. End User IdP Login handles that explicitly by mapping an IdP-sent attribute to a role you create. For the SSO spine, CustomGPT SSO is configured using SAML, including SAML configuration import, and common IdP guides use SAML 2.0. (OIDC is common in other portal stacks, but don’t assume it’s your CustomGPT SSO mode unless your setup says so.) Learn more about SSO configuration.
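The authentication-versus-authorization split above can be sketched in a few lines. This is purely illustrative (the names `ROLE_AGENTS` and `resolve_agents` are made up for this example); CustomGPT performs the actual matching server-side against the roles you define:

```python
# Illustrative sketch of the authn-vs-authz split: SSO proves identity,
# then an IdP-sent attribute value is matched to a role name you created.
# All names here are hypothetical; CustomGPT does this matching for you.

ROLE_AGENTS = {
    # role name (must match the IdP attribute value exactly, case-sensitive)
    "partner_tier_a": ["Partner Support Agent"],
}

def resolve_agents(idp_attribute_value: str) -> list[str]:
    """Authorization step: map the IdP-sent attribute value to entitled agents.

    Authentication already proved who the user is; this step decides what
    they may access. An unknown or mismatched value fails closed (no agents).
    """
    return ROLE_AGENTS.get(idp_attribute_value, [])

print(resolve_agents("partner_tier_a"))  # ['Partner Support Agent']
print(resolve_agents("Partner_Tier_A"))  # [] -- wrong case, fails closed
```

Note the fail-closed behavior: a missing or misspelled attribute value yields no access, which is exactly the posture you want for external users.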

One URL For Everyone

The simplest operating model is one portal entry point for every external user, with routing determined after IdP authentication. That reduces admin overhead because you avoid per-client link management and “which URL did we send them” support churn. With End User IdP Login, you share a Portal Login URL that’s unique to your organization and has the format https://app.customgpt.ai/portal/[random-string]. After login, users either land directly in chat for a single entitled agent or see a selection portal if they’re entitled to multiple agents. If your stakeholders expect a custom domain for the portal entry link, set expectations early. The current limitation is that the Portal URL cannot be customized or white-labelled at this time.

Example Tier Routing

Imagine a partner portal where Tier A users should see a “Partner Support Agent” and Tier B users should see both “Partner Support Agent” and “Implementation Agent.” You model tier as an IdP attribute value and create matching roles, then assign the right agents to each role. The result is one Portal Login URL for the whole ecosystem, but each user sees only the agent set that matches their tier. If your IdP sends multiple attribute values for a user, they can see a portal listing all accessible agents. Learn more about creating agent-specific custom roles.
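The tier routing above can be modeled as a simple union over attribute values. This sketch uses the role and agent names from the example; the routing itself happens inside CustomGPT, not in your code:

```python
# Illustrative tier routing: one Portal Login URL, per-tier agent sets.
# Role and agent names mirror the example in the article; the mapping
# below is hypothetical and lives inside CustomGPT in practice.

ROLE_AGENTS = {
    "tier_a": ["Partner Support Agent"],
    "tier_b": ["Partner Support Agent", "Implementation Agent"],
}

def portal_view(attribute_values: list[str]) -> list[str]:
    """Union of agents across all attribute values the IdP sends.

    One entitled agent means the user lands directly in chat; several
    mean a selection portal listing all accessible agents.
    """
    agents: list[str] = []
    for value in attribute_values:
        for agent in ROLE_AGENTS.get(value, []):
            if agent not in agents:  # deduplicate, preserve order
                agents.append(agent)
    return agents

print(portal_view(["tier_b"]))            # two agents -> selection portal
print(portal_view(["tier_a", "tier_b"]))  # union, deduplicated
```

A user whose IdP sends both `tier_a` and `tier_b` sees the deduplicated union, matching the multi-attribute behavior described above.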

How it Works

From the end-user’s perspective, the flow should feel like “log in, then chat,” not “create a new account somewhere else.” This section is what portal owners can paste into release notes and what security teams can sanity-check.
  1. Portal flow: Visit → IdP login → routed to allowed agent → chat
  2. Embed flow: Page or intranet or portal → sign in → chat
After authentication, CustomGPT grants a 24-hour session that is fixed and not configurable. Each authentication starts a new conversation, and conversation history is not preserved between sessions for IdP end-users. End-users remain chat-only and anonymous in analytics, and if they try to access non-chat pages they are redirected back to the agent portal. This keeps the portal experience “AI inside the boundary” rather than a broader app exposure. Learn more about what your external users experience.
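The fixed 24-hour session window can be expressed as a one-line check, useful when explaining the behavior to reviewers. This is a sketch of the documented behavior, not CustomGPT's implementation; the check lives on their side and is not configurable:

```python
# Sketch of the fixed 24-hour session behavior described above.
# The window is not configurable; re-authentication after expiry
# also starts a new conversation (history is not preserved).
from datetime import datetime, timedelta, timezone

SESSION_TTL = timedelta(hours=24)  # fixed, per the documentation

def session_valid(authenticated_at: datetime, now: datetime) -> bool:
    """A session is valid for 24 hours from authentication."""
    return now - authenticated_at < SESSION_TTL

login = datetime(2025, 1, 1, 9, 0, tzinfo=timezone.utc)
print(session_valid(login, login + timedelta(hours=23)))  # True
print(session_valid(login, login + timedelta(hours=25)))  # False
```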

Setup Snapshot

This setup is intentionally “SSO first, then AI access control,” so identity remains owned by IT/Security and your portal team focuses on experience and entitlements. You’ll need the feature enabled on your plan, SSO already configured, and an attribute ready in your IdP.
  1. In CustomGPT Teams, create the roles that external users will map into under Teams and Roles. The role name must exactly match the IdP attribute value (case-sensitive), and Chat Only is strongly recommended so external users can interact without administrative access.
  2. Enable IdP End-User Access Control in your profile’s Single Sign On settings, enter the attribute name (for example customgpt_role), and copy the generated Portal Login URL to share.
  3. Verify routing: it succeeds only when the IdP sends the correct attribute name and the attribute value matches a role that has at least one private agent assigned.
If you want the AI inside an existing portal page, you can use the IdP-gated private deployment path by setting Agent Visibility to Private and selecting Enabled (IdP) under Private Agent Deployment. The embed guide also calls out that this is tied to a Teams enterprise plan with SSO enabled. See the full guide on deploying IdP access-controlled agents to external websites.
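For reviewers unfamiliar with SAML, the IdP-sent attribute might look like the fragment below inside an assertion. This is a hypothetical illustration using the `customgpt_role` attribute name from the example; the exact assertion structure varies by IdP:

```xml
<!-- Hypothetical SAML 2.0 assertion fragment; exact structure varies by IdP.
     The Attribute Name must match what you entered in CustomGPT, and the
     value must exactly match a role name you created (case-sensitive). -->
<saml:AttributeStatement xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Attribute Name="customgpt_role">
    <saml:AttributeValue>tier_b</saml:AttributeValue>
  </saml:Attribute>
</saml:AttributeStatement>
```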

Privacy Safe Measurement

Security-conscious portal owners usually want outcomes without turning the portal AI into a personal tracking system. End User IdP Login supports that posture by keeping end-users anonymous in analytics and avoiding end-user account creation. You should also document the measurement tradeoffs up front: analytics cannot track usage by role or show which attribute was used, and detailed access audit logs are not available yet.

Operationally, you can still monitor usage and quality at the platform level using agent analytics and conversation views, and you can review chronological activity in event logs. For compliance posture, Conversation retention period lets you manage how long conversations are stored. For access management, see how to update and revoke external agent access. If you need the agent to guide users to a defined next step, Drive Conversions provides a goal-oriented action with usage tracking, but it’s explicitly positioned as a premium feature.

This is the concise proof layer most IT and security reviewers need: authentication protocol, access control model, session behavior, and current audit limitations. Forward it as-is to speed up approval without pulling engineers into a long thread.

  1. SAML 2.0 SSO: CustomGPT SSO is set up by importing SAML configuration, and common IdP guides use SAML 2.0 app setup including Microsoft Azure and Google Workspace.
  2. Role-based access: Your IdP sends an attribute; CustomGPT matches the attribute value to an exact role name you create.
  3. Chat-only external users: End-users are not created as CustomGPT accounts, remain chat-only, and appear anonymous in analytics.
  4. Session expectations: Sessions are 24 hours (fixed). Access changes take effect on next authentication, up to 24 hours.
  5. Audit tradeoffs today: Portal URL works with share link and embedded deployments; detailed access audit logs are not available yet.

Success looks like your security team signs off on the model, your portal team ships one stable login entry point, and support tickets shift from “access confusion” to “better answers” improvements.

Conclusion

If you already have a B2B client or partner portal, the lowest-effort secure pattern is End User IdP Login with a single Portal Login URL. Industry research shows AI-enhanced customer portals are expected to support 50% of all customer service issues without human intervention by 2025. Your IdP remains the source of truth, and your portal team controls which agents map to which roles. The global chatbot market is projected to reach $27.3 billion by 2030, but deployment security remains a critical decision point. You can embed a chatbot publicly, but “behind login” requires access control. For security posture, set expectations that sessions are 24 hours and analytics are anonymous by design, and document the current limitations around role-level analytics and detailed access audit logs. Then iterate on agent quality, not login plumbing. Start building your portal agents with a free trial, then contact sales to enable Teams and End User IdP Login for secure client access.

Frequently Asked Questions

Can I use AI inside a private client portal instead of a public website?

Yes. You can keep AI chat behind your existing portal login and let your identity provider control who can access it. The supported model is to authenticate users through your IdP, then route them to only the agent or agents their role permits, using one Portal Login URL or an IdP-gated embed. Elizabeth Planet described the value of controlled knowledge sources this way: “I added a couple of trusted sources to the chatbot and the answers improved tremendously! You can rely on the responses it gives you because it’s only pulling from curated information.” That is especially useful in a private portal where users need answers from approved client content rather than public pages.

How do I route different client tiers or partner roles to different AI assistants after login?

Treat authentication and authorization as separate steps. First, the user signs in through your IdP. Second, the IdP sends an attribute value. Third, that value must match the role name you created, and the user is shown only the assistant tied to that role. One shared login entry point can work for many client tiers or partner groups, as long as entitlement mapping happens after login and access is denied when there is no valid role match.

Do clients need separate chatbot accounts to use AI in the portal?

No. End users authenticate through your existing identity provider, and no separate end-user account is created for them. They get chat-only access, while your team keeps identity and entitlement management in the login system you already use. Aslan AI summed up the integration benefit this way: “From beginning to end of the project, CustomGPT was the solution. With further integration of new features, we might even abandon some tools like Bubble or ChatPDF.”

What prevents one client from seeing another client’s AI assistant?

Role-based authorization is the main safeguard. SSO proves who the user is, but it does not decide what they can access. To prevent cross-client exposure, your IdP should send the entitlement attribute, that value should match the correct role name, and the user should see only the agent or agents allowed for that role. If the attribute is missing or incorrect, access should fail closed. For security posture, the provided materials also state SOC 2 Type 2 certification and SAML-based SSO support.

Is one login URL enough for every client in the portal?

Usually yes. The default operating model is one Portal Login URL for all external users, with routing handled after IdP authentication. That keeps the experience simpler for users and reduces admin overhead compared with managing separate AI entry links for each client. The Tokenizer described the value of connecting a large specialized knowledge base to a controlled user experience: “Based on our huge database, which we have built up over the past three years, and in close cooperation with CustomGPT, we have launched this amazing regulatory service, which both law firms and a wide range of industry professionals in our space will benefit greatly from.”

Can I add a client-specific AI assistant for a long proposal or account documents inside the portal?

Yes. You can create an assistant around a proposal, account, or document set, then restrict access to the matching portal role so only the right users can open it. This works well for document-heavy use cases because the system is RAG-powered and supports files such as PDF, DOCX, TXT, CSV, HTML, XML, JSON, audio, video, and URLs, with a maximum file size of 100MB per file. In the provided benchmark, CustomGPT.ai outperformed OpenAI in RAG accuracy, which is relevant when users ask detailed questions about specific clauses or account materials.
