feat: sharepoint perm sync #5033
Conversation
Greptile Summary
This PR implements SharePoint permission synchronization with certificate-based authentication support. The changes span both frontend and backend, introducing a new authentication method using certificates (.pfx files) alongside the existing client secret method. Key components include:
- New /credential/private-key endpoint for handling certificate uploads
- SharePoint auto-sync configuration with document and group sync frequencies
- UI components for certificate file upload with validation (10KB size limit, .pfx extension)
- Enterprise Edition permission sync implementation with recursive group resolution
- Integration with the existing credential management system
The implementation follows established patterns from other connectors (like Confluence and Google Drive) while adding SharePoint-specific authentication handling.
Confidence score: 3/5
- PR requires careful review of certificate handling and permission sync logic
- Score reflects complexity of permission sync implementation and potential issues in recursive group resolution
- Key files needing attention:
- backend/ee/onyx/external_permissions/sharepoint/utils.py (recursive group resolution)
- backend/ee/onyx/external_permissions/sharepoint/group_sync.py (error handling)
- backend/onyx/server/documents/credential.py (certificate handling)
Potential Issues:
- The recursive group resolution in sharepoint/utils.py could lead to performance issues with deeply nested groups
- Error handling in group_sync.py uses broad exception catching which might mask specific issues
- No explicit cleanup of file handles in private key processing
- Duplicate validation schema in web/src/components/credentials/lib.ts needs refactoring
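The nested-group concern above can be avoided by resolving memberships iteratively with a visited set instead of recursing. A minimal sketch; the `fetch_group_members` callable is a hypothetical stand-in for the actual Graph API lookup, not code from this PR:

```python
from collections import deque


def resolve_group_members(root_group: str, fetch_group_members) -> set[str]:
    """Iteratively expand nested groups, guarding against cycles.

    fetch_group_members(group_id) is assumed to return a list of
    (member_id, is_group) tuples.
    """
    users: set[str] = set()
    visited: set[str] = set()
    queue = deque([root_group])
    while queue:
        group_id = queue.popleft()
        if group_id in visited:
            continue  # cycle or duplicate nesting; do not re-expand
        visited.add(group_id)
        for member_id, is_group in fetch_group_members(group_id):
            if is_group:
                queue.append(member_id)
            else:
                users.add(member_id)
    return users
```

Because each group is expanded at most once, deeply or cyclically nested groups cost linear time instead of risking unbounded recursion.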
18 files reviewed, 10 comments
Edit PR Review Bot Settings | Greptile
web/src/components/Field.tsx
Outdated
if (extension !== "pfx") {
  setCustomError(`File must have a .pfx extension`);
  return false;
logic: Add check for undefined before using pop() to prevent potential runtime errors
key={key}
name={key}
label={getDisplayNameForCredentialKey(key)}
maxSizeKB={isPrivateKey ? 10 : undefined}
style: consider making the 10KB limit a named constant at component/module level for better maintainability
Suggested change:
- maxSizeKB={isPrivateKey ? 10 : undefined}
+ const PRIVATE_KEY_MAX_SIZE_KB = 10;
+ maxSizeKB={isPrivateKey ? PRIVATE_KEY_MAX_SIZE_KB : undefined}
} else if (def instanceof File) {
  // File upload fields with size and extension validation
  schemaFields[key] = Yup.mixed()
    .required(`Please select a ${displayName} file`)
    .test("fileSize", "File size must be less than 10KB", (value) => {
      if (!value) return false; // Require file
      if (value instanceof File) {
        return value.size <= 10 * 1024; // 10KB in bytes
      }
      return false;
    })
    .test("fileExtension", "File must have .pfx extension", (value) => {
      if (!value) return false; // Require file
      if (value instanceof File) {
        return value.name.toLowerCase().endsWith(".pfx");
      }
      return false;
    });
} else {
style: Consider extracting the file validation schema into a reusable function to avoid duplication with lines 84-102
"""
Execute a SharePoint query with retry logic for rate limiting.
"""
retries = 0
logic: retries variable initialized but never incremented before first query execution
Suggested change:
- retries = 0
+ retries = 0
+ while retries < max_retries:
+     try:
+         return query_obj.execute_query()
+     except ClientRequestException as e:
+         if e.response and e.response.status_code == 429:
+             logger.warning("Rate limit exceeded, sleeping and retrying query execution")
+             retry_after = e.response.headers.get("Retry-After")
+             if retry_after:
+                 time.sleep(int(retry_after))
+             else:
+                 # Default sleep if no retry-after header
+                 time.sleep(30)
+             retries += 1
+         else:
+             raise e
+ raise ClientRequestException("Max retries exceeded")
print(f"Error loading certificate: {e}")
return None
style: avoid print() statement, use logger instead since it's already set up
Suggested change:
- print(f"Error loading certificate: {e}")
- return None
+ logger.error(f"Error loading certificate: {e}")
+ return None
elif hasattr(content, "value") and isinstance(content, bytes):
    content_bytes = content
logic: condition will never be true - hasattr(content, 'value') and isinstance(content, bytes) is redundant
if private_key:
    private_key_content = process_private_key_file(private_key)
else:
    private_key_content = None
style: private_key_content is set to None and then immediately checked. Simplify by initializing it to None and dropping the else branch.
Suggested change:
- if private_key:
-     private_key_content = process_private_key_file(private_key)
- else:
-     private_key_content = None
+ private_key_content = None
+ if private_key:
+     private_key_content = process_private_key_file(private_key)
    lambda: acquire_token_for_rest(msal_app, sp_tenant_domain)
)
else:
    raise RuntimeError("MSAL app or tenant domain is not set")
style: raise ConnectorValidationError instead of RuntimeError for consistency with other credential validation errors
if not _ignore_credential_permissions(DocumentSource(source)):
    fetch_ee_implementation_or_noop(
        "onyx.db.user_group", "validate_object_creation_for_user", None
    )(
        db_session=db_session,
        user=user,
        target_group_ids=groups,
        object_is_public=curator_public,
    )
style: This permission validation block is duplicated from the other endpoint. Consider extracting to a helper function.
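One way to deduplicate the block is to extract it into a helper with the dependencies passed in as callables; a sketch under that assumption (the helper name and injected parameters are illustrative, not the PR's actual code):

```python
def validate_permissions_for_groups(
    db_session,
    user,
    groups,
    object_is_public: bool,
    source: str,
    ignore_credential_permissions,
    validate_object_creation,
) -> None:
    """Shared permission check for credential-creation endpoints.

    ignore_credential_permissions(source) -> bool decides whether the
    check applies; validate_object_creation performs the EE validation.
    """
    if ignore_credential_permissions(source):
        return
    validate_object_creation(
        db_session=db_session,
        user=user,
        target_group_ids=groups,
        object_is_public=object_is_public,
    )
```

Both endpoints would then call this helper instead of repeating the block.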
    scopes=["https://graph.microsoft.com/.default"]
)
- return token
+ return token or {}
logic: return empty dict could cause authentication issues - should raise exception on token failure
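A sketch of failing loudly on token acquisition instead of returning an empty dict; the `msal_app` parameter is assumed to be an MSAL `ConfidentialClientApplication`, and the error fields follow MSAL's documented response shape:

```python
def acquire_graph_token(msal_app) -> dict:
    """Acquire an app-only Graph token; raise instead of returning {}.

    MSAL returns a dict with "access_token" on success, or
    "error"/"error_description" on failure.
    """
    token = msal_app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"]
    )
    if not token or "access_token" not in token:
        detail = (token or {}).get("error_description", "unknown error")
        raise RuntimeError(f"Failed to acquire Graph token: {detail}")
    return token
```

Raising here surfaces misconfigured credentials immediately rather than letting downstream Graph calls fail with confusing 401s.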
cubic found 11 issues across 18 files. Review them in cubic.dev
React with 👍 or 👎 to teach cubic. Tag @cubic-dev-ai to give specific feedback.
curator_public=curator_public,
groups=groups,
name=name,
source=DocumentSource(source),
Casting user-supplied string to DocumentSource enum without validation can raise ValueError and crash with 500; input should be validated and proper HTTP error returned.
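A sketch of validating before casting; the `DocumentSource` enum below is a trimmed stand-in for the real one, and in the endpoint the `ValueError` would be re-raised as an `HTTPException(status_code=400, ...)`:

```python
from enum import Enum


class DocumentSource(str, Enum):
    # Trimmed stand-in for the project's full enum
    SHAREPOINT = "sharepoint"
    CONFLUENCE = "confluence"


def parse_document_source(source: str) -> DocumentSource:
    """Validate user input before casting so a bad value becomes a clear
    client error instead of an unhandled ValueError (HTTP 500)."""
    try:
        return DocumentSource(source)
    except ValueError:
        # In the endpoint: raise HTTPException(status_code=400, detail=...)
        raise ValueError(
            f"Invalid source '{source}'; expected one of "
            f"{[s.value for s in DocumentSource]}"
        )
```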
    db_session: Session = Depends(get_session),
) -> ObjectCreationIdResponse:
    credential_data = json.loads(credential_json)
    auth_method = credential_data["authentication_method"]
Direct dict key access can raise KeyError when the field is missing, resulting in an unhandled 500 error; use .get() with validation instead.
Suggested change:
- auth_method = credential_data["authentication_method"]
+ auth_method = credential_data.get("authentication_method")
    private_key: UploadFile | None = File(None),
    db_session: Session = Depends(get_session),
) -> ObjectCreationIdResponse:
    credential_data = json.loads(credential_json)
json.loads is called without error handling, so invalid JSON in the form field will raise a JSONDecodeError and return a 500 Internal Server Error instead of a 4xx client error.
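A sketch of the suggested guard; the helper name is illustrative, and in the endpoint the `ValueError` would become an `HTTPException` with a 4xx status:

```python
import json


def parse_credential_json(credential_json: str) -> dict:
    """Parse the multipart form field, turning malformed JSON into a
    client-facing error instead of an unhandled 500."""
    try:
        data = json.loads(credential_json)
    except json.JSONDecodeError as e:
        # In the endpoint: raise HTTPException(status_code=400, detail=...)
        raise ValueError(f"credential_json is not valid JSON: {e}")
    if not isinstance(data, dict):
        raise ValueError("credential_json must be a JSON object")
    return data
```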
web/src/components/Field.tsx
Outdated
  return false;
}
let extension = file.name.split(".").pop();
if (extension !== "pfx") {
File-extension check is case-sensitive, so valid files such as certificate.PFX will be rejected. Convert the extension to lower-case before comparing.
Suggested change:
- if (extension !== "pfx") {
+ if (extension?.toLowerCase() !== "pfx") {
web/src/components/Field.tsx
Outdated
}) {
  // This component keeps the file as a File object without extracting its content
  // Useful for files like private keys that need to be processed on the backend
  const [field, meta, helpers] = useField<File | null>(name);
The meta variable returned by useField is never used, which will trigger an @typescript-eslint/no-unused-vars lint error and can break CI linting pipelines.
Suggested change:
- const [field, meta, helpers] = useField<File | null>(name);
+ const [field, , helpers] = useField<File | null>(name);
backend/onyx/server/utils.py
Outdated
def process_private_key_file(file: UploadFile) -> str:
    if file.filename and file.filename.endswith(".pfx"):
File type validation only checks the file extension, which can be easily spoofed. Consider validating the file content (e.g., checking for PKCS#12 magic bytes) for better security.
)

return {
    "private_key": key_pem,
key_pem is a bytes object, but MSAL expects the PEM value to be a UTF-8 string; passing bytes will raise a type error during authentication.
Suggested change:
- "private_key": key_pem,
+ "private_key": key_pem.decode("utf-8"),
# Handle different content types
if isinstance(content, bytes):
    content_bytes = content
elif hasattr(content, "value") and isinstance(content, bytes):
The condition mistakenly checks isinstance(content, bytes) instead of isinstance(content.value, bytes), making the branch unreachable and preventing handling of objects whose data is stored in a .value attribute.
Suggested change:
- elif hasattr(content, "value") and isinstance(content, bytes):
+ elif hasattr(content, "value") and isinstance(content.value, bytes):
        "thumbprint": certificate.fingerprint(hashes.SHA1()).hex(),
    }
except Exception as e:
    print(f"Error loading certificate: {e}")
Using print for error reporting bypasses the project-wide logging system, leading to inconsistent log handling and potential loss of visibility in production.
Suggested change:
- print(f"Error loading certificate: {e}")
+ logger.error(f"Error loading certificate: {e}")
  .notRequired();
} else if (def instanceof File) {
  // File upload fields with size and extension validation
  schemaFields[key] = Yup.mixed()
File fields inside a multi-auth template are always marked as required, ignoring the selected authentication_method. This forces users to upload files even when they pick a different auth method (e.g., username / password), breaking the form validation.
@cubic-dev-ai review this
@wenxi-onyx I've started the AI code review. It'll take a few minutes to complete.
cubic found 5 issues across 23 files. Review them in cubic.dev
React with 👍 or 👎 to teach cubic. Tag @cubic-dev-ai to give specific feedback.
)

# Read file content for validation and processing
private_key_bytes = file.file.read()
Entire uploaded file is read into memory without size limits, which can lead to excessive memory usage or DoS when large files are uploaded.
<file context>
@@ -0,0 +1,57 @@
+import base64
+from enum import Enum
+from typing import Protocol
+
+from fastapi import HTTPException
+from fastapi import UploadFile
+
+from onyx.server.documents.utils import validate_pkcs12_content
+
+
+class ProcessPrivateKeyFileProtocol(Protocol):
+ def __call__(self, file: UploadFile) -> str:
+ """
+ Accepts a file-like object, validates the file (e.g., checks extension and content),
+ and returns its contents as a base64-encoded string if valid.
+ Raises an exception if validation fails.
+ """
+ ...
+
+
+class PrivateKeyFileTypes(Enum):
+ SHAREPOINT_PFX_FILE = "sharepoint_pfx_file"
+
+
+def process_sharepoint_private_key_file(file: UploadFile) -> str:
+ """
+ Process and validate a private key file upload.
+
+ Validates both the file extension and file content to ensure it's a valid PKCS#12 file.
+ Content validation prevents attacks that rely on file extension spoofing.
+ """
+ # First check file extension (basic filter)
+ if not (file.filename and file.filename.endswith(".pfx")):
+ raise HTTPException(
+ status_code=400, detail="Invalid file type. Only .pfx files are supported."
+ )
+
+ # Read file content for validation and processing
+ private_key_bytes = file.file.read()
+
+ # Validate file content to prevent extension spoofing attacks
+ if not validate_pkcs12_content(private_key_bytes):
+ raise HTTPException(
+ status_code=400,
+ detail="Invalid file content. The uploaded file does not appear to be a valid PKCS#12 (.pfx) file.",
+ )
+
+ # Convert to base64 if validation passes
+ pfx_64 = base64.b64encode(private_key_bytes).decode("ascii")
+ return pfx_64
+
+
+FILE_TYPE_TO_FILE_PROCESSOR: dict[
+ PrivateKeyFileTypes, ProcessPrivateKeyFileProtocol
+] = {
+ PrivateKeyFileTypes.SHAREPOINT_PFX_FILE: process_sharepoint_private_key_file,
+}
</file context>
Content validation prevents attacks that rely on file extension spoofing.
"""
# First check file extension (basic filter)
if not (file.filename and file.filename.endswith(".pfx")):
File extension check is case-sensitive, so valid files like “CERT.PFX” will be rejected.
Suggested change:
- if not (file.filename and file.filename.endswith(".pfx")):
+ if not (file.filename and file.filename.lower().endswith(".pfx")):
+1
Check if the exception indicates a password-related issue rather than a format issue.
"""
error_msg = str(error).lower()
password_keywords = ["mac", "integrity", "password", "authentication", "verify"]
The validation treats any ValueError containing the words "integrity" or "verify" as a password-related error and therefore accepts the file as valid. These terms are generic and can appear in many non-password error messages, allowing a malformed or malicious file to bypass PKCS#12 validation.
<file context>
@@ -0,0 +1,75 @@
+from cryptography.hazmat.primitives.serialization import pkcs12
+
+from onyx.utils.logger import setup_logger
+
+logger = setup_logger()
+
+
+def _is_password_related_error(error: Exception) -> bool:
+ """
+ Check if the exception indicates a password-related issue rather than a format issue.
+ """
+ error_msg = str(error).lower()
+ password_keywords = ["mac", "integrity", "password", "authentication", "verify"]
+ return any(keyword in error_msg for keyword in password_keywords)
+
+
+def validate_pkcs12_content(file_bytes: bytes) -> bool:
+ """
+ Validate that the file content is actually a PKCS#12 file.
+ This performs basic format validation without requiring passwords.
+ """
+ try:
+ # Basic file size check
+ if len(file_bytes) < 10:
+ logger.debug("File too small to be a valid PKCS#12 file")
+ return False
+
+ # Check for PKCS#12 magic bytes/ASN.1 structure
+ # PKCS#12 files start with ASN.1 SEQUENCE tag (0x30)
+ if file_bytes[0] != 0x30:
+ logger.debug("File does not start with ASN.1 SEQUENCE tag")
+ return False
+
+ # Try to parse the outer ASN.1 structure without password validation
+ # This checks if the file has the basic PKCS#12 structure
+ try:
+ # Attempt to load just to validate the basic format
+ # We expect this to fail due to password, but it should fail with a specific error
+ pkcs12.load_key_and_certificates(file_bytes, password=None)
+ return True
+ except ValueError as e:
+ # Check if the error is related to password (expected) vs format issues
+ if _is_password_related_error(e):
+ # These errors indicate the file format is correct but password is wrong/missing
+ logger.debug(
+ f"PKCS#12 format appears valid, password-related error: {e}"
+ )
+ return True
+ else:
+ # Other ValueError likely indicates format issues
+ logger.debug(f"PKCS#12 format validation failed: {e}")
+ return False
+ except Exception as e:
+ # Try with empty password as fallback
+ try:
+ pkcs12.load_key_and_certificates(file_bytes, password=b"")
+ return True
+ except ValueError as e2:
+ if _is_password_related_error(e2):
+ logger.debug(
+ f"PKCS#12 format appears valid with empty password attempt: {e2}"
+ )
+ return True
+ else:
+ logger.debug(
+ f"PKCS#12 validation failed on both attempts: {e}, {e2}"
+ )
+ return False
+ except Exception:
+ logger.debug(f"PKCS#12 validation failed: {e}")
+ return False
+
+ except Exception as e:
+ logger.debug(f"Unexpected error during PKCS#12 validation: {e}")
+ return False
</file context>
    .transform((v, o) => (o === undefined ? false : v));
} else if (isTypedFileField(key)) {
  // TypedFile fields - use mixed schema instead of string (check before null check)
  schemaFields[key] = Yup.mixed().required(
This unconditional assignment makes the typed-file field required for all authentication methods, overriding the earlier conditional validation created inside the authMethods loop. In multi-auth templates the field will be required even when the selected authentication method does not use it, preventing valid submissions.
<file context>
@@ -6,6 +6,7 @@ import {
getDisplayNameForCredentialKey,
CredentialTemplateWithAuth,
} from "@/lib/connectors/credentials";
+import { isTypedFileField } from "@/lib/connectors/fileTypes";
export function createValidationSchema(json_values: Record<string, any>) {
const schemaFields: Record<string, Yup.AnySchema> = {};
@@ -16,7 +17,6 @@ export function createValidationSchema(json_values: Record<string, any>) {
schemaFields["authentication_method"] = Yup.string().required(
"Please select an authentication method"
);
-
// conditional rules per authMethod
template.authMethods.forEach((method) => {
Object.entries(method.fields).forEach(([key, def]) => {
@@ -26,6 +26,14 @@ export function createValidationSchema(json_values: Record<string, any>) {
.nullable()
.default(false)
.transform((v, o) => (o === undefined ? false : v));
+ } else if (isTypedFileField(key)) {
+ //TypedFile fields - use mixed schema instead of string (check before null check)
+ schemaFields[key] = Yup.mixed().when("authentication_method", {
+ is: method.value,
+ then: () =>
+ Yup.mixed().required(`Please select a ${displayName} file`),
+ otherwise: () => Yup.mixed().notRequired(),
+ });
} else if (def === null) {
schemaFields[key] = Yup.string()
.trim()
@@ -58,6 +66,11 @@ export function createValidationSchema(json_values: Record<string, any>) {
.nullable()
.default(false)
.transform((v, o) => (o === undefined ? false : v));
+ } else if (isTypedFileField(key)) {
+ // TypedFile fields - use mixed schema instead of string (check before null check)
+ schemaFields[key] = Yup.mixed().required(
+ `Please select a ${displayName} file`
+ );
} else if (def === null) {
schemaFields[key] = Yup.string()
.trim()
@@ -77,11 +90,16 @@ export function createValidationSchema(json_values: Record<string, any>) {
}
export function createEditingValidationSchema(json_values: dictionaryType) {
- const schemaFields: { [key: string]: Yup.StringSchema } = {};
+ const schemaFields: { [key: string]: Yup.AnySchema } = {};
for (const key in json_values) {
if (Object.prototype.hasOwnProperty.call(json_values, key)) {
- schemaFields[key] = Yup.string().optional();
+ if (isTypedFileField(key)) {
+ // TypedFile fields - use mixed schema for optional file uploads during editing
+ schemaFields[key] = Yup.mixed().optional();
+ } else {
+ schemaFields[key] = Yup.string().optional();
+ }
}
}
@@ -95,7 +113,12 @@ export function createInitialValues(credential: Credential<any>): formType {
};
for (const key in credential.credential_json) {
- initialValues[key] = "";
+ // Initialize TypedFile fields as null, other fields as empty strings
+ if (isTypedFileField(key)) {
+ initialValues[key] = null as any; // TypedFile fields start as null
+ } else {
+ initialValues[key] = "";
+ }
}
return initialValues;
</file context>
member = assignment.member

# Check for anonymous users (principal_type 3)
if hasattr(member, "principal_type") and member.principal_type == 3:
prefer having constants for each principle type + comments that clearly define what each means
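A sketch of the suggested constants; only the anonymous value (3) comes from this PR's snippet, so treat the enum as illustrative rather than the authoritative SharePoint principal-type mapping:

```python
from enum import IntEnum


class SharePointPrincipalType(IntEnum):
    """Named constants for the numeric principal types seen in role
    assignments. ANONYMOUS = 3 is taken from this PR's check; consult the
    SharePoint documentation before adding the remaining values."""

    ANONYMOUS = 3


def is_anonymous(member) -> bool:
    """Safely check a role-assignment member, tolerating response shapes
    that lack a principal_type attribute."""
    principal_type = getattr(member, "principal_type", None)
    return principal_type == SharePointPrincipalType.ANONYMOUS
```

Using `getattr` with a default also replaces the scattered `hasattr` checks discussed below.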
also, why do we have to do this hasattr check everywhere?
We get different response structures based on the principal type, which is why we use hasattr. However, it's not needed in some places, so I will remove it now.
web = client_context.web

# Patterns that indicate public access
public_login_patterns: list[str] = [
can we link where this is from in a comment? I'm a little scared about this
Now I've changed the approach a little bit and added the link.
principal_type: int

def _sleep_and_retry(query_obj: Any, method_name: str, max_retries: int = 3) -> Any:
is there a way to do this in a typed way? Obviously, you can pass in any method name and there won't be any static-time checking
Added a type for query_obj now.
Content validation prevents attacks that rely on file extension spoofing.
"""
# First check file extension (basic filter)
if not (file.filename and file.filename.endswith(".pfx")):
+1
web/src/lib/credential.ts
Outdated
) {
  const formData = new FormData();
  formData.append(
    "credential_json",
nit: prefer using constants / a shared util across both createCredentialWithPrivateKey and updateCredentialWithPrivateKey
@subash-onyx bump 👊
Sorry, missed the comment 😅. Now I added the constants.
@router.post("/credential/private-key")
def create_credential_with_private_key(
does this need to be a separate endpoint? Could we just add file support to the existing one?
Current endpoint is used in other scripts and tests
Also, looks like we need to rebase. I added
…omains and remove direct tenant domain input from credentials
force-pushed from 98eb58a to 934e5f9 (Compare)
…rocessing on unprocessable files
* sharepoint perm sync first draft
* feat: Implement SharePoint permission synchronization
* mypy fix
* remove commented code
* bot comments fixes and job failure fixes
* introduce generic way to upload certificates in credentials
* mypy fix
* add checkpointing to sharepoint connector
* add sharepoint integration tests
* Refactor SharePoint connector to derive tenant domain from verified domains and remove direct tenant domain input from credentials
* address review comments
* add permission sync to site pages
* mypy fix
* fix tests error
* fix tests and address comments
* Update file extraction behavior in SharePoint connector to continue processing on unprocessable files
Description
This PR includes updates related to SharePoint permission sync. It introduces a new authentication method for SharePoint using certificates. Additionally, a new endpoint (/credential/private-key) has been added to credentials.py to support certificate-based authentication and private key uploads.
How Has This Been Tested?
Tested manually by creating the connector.
Backporting (check the box to trigger backport action)
Note: You have to check that the action passes, otherwise resolve the conflicts manually and tag the patches.
Summary by cubic
Added SharePoint permission sync with support for certificate-based authentication and private key upload via the new /credential/private-key API endpoint, enabling more secure and flexible connector setup.