Mirror of https://github.com/EsotericSoftware/spine-runtimes.git, synced 2026-02-04 22:34:53 +08:00
[tests] Auto-generation of Java SkeletonSerializer

This commit is contained in:
parent 73a17e88c9
commit 3183c0b383
@@ -1,4 +1,4 @@
- SkeletonBinary.cpp is buggy. DataInput sign handling seems to be wrong, so we get incorrect flags etc. Compare to SkeletonBinary.java, both statically (compare sources and involved classes) and dynamically (instrument both, run on the same file, compare where we go wrong)
- lsp-cli should export its types, so we can pull them in as a dependency instead of redefining them ourselves in types.ts
- clean up logging in spine-c/codegen, use chalk to do colored warnings/errors and make logging look very nice and informative (no emojis)
- spine-c/codegen type extractor should also report typedefs like typedef long long PropertyId; i.e. a primitive type mapped to a name, and we need to handle that in the codegen
- Generate language bindings in spine-c/codegen
193 docs/todos/work/2025-01-11-03-02-54-test-suite/task.md Normal file
@@ -0,0 +1,193 @@
# create a folder test/ and write a comprehensive test suite

**Status:** In Progress
**Created:** 2025-01-11T03:02:54
**Started:** 2025-01-11T03:11:22
**Agent PID:** 89579
## Original Todo

- create a folder test/ and write a comprehensive test suite
  - For each core runtime
    - Write a program that takes as input a skeleton (.json or .skel) and atlas file path and animation name
    - Loads a SkeletonData and outputs EVERYTHING in a simple, diffable text format
    - Creates a skeleton from the SkeletonData, an AnimationState, sets the Animation on track 0, updates and applies the state to the skeleton, then outputs EVERYTHING in a diffable text format
    - The best approach is likely to create a Printer interface that can print each relevant type in the diffable format, with a specific indentation level, so the output represents the hierarchy of the data
  - See docs/project-description.md for the core runtimes and their location
  - The test/ folder should have simple language-specific build scripts that build/pull in the core runtime for that language, and create an executable program we can invoke
    - build.gradle for Java, directly pulling in the spine-libgdx project via settings.gradle
    - CMakeLists.txt for C and C++
    - package.json/tsconfig.json for TypeScript
    - Let's ignore the other core runtimes for now
  - The programs must be headless, which means we need to ensure that when loading the atlases, texture loading doesn't actually happen.
  - The goal is to be able to construct bash or Node.js test suites that can find errors in the non-reference runtimes quickly by comparing actually loaded and "applied" data between the runtimes
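The Printer idea above could be sketched like this. This is a minimal stand-in, not the actual implementation in the runtimes; the method names and the example values are illustrative:

```java
// Minimal sketch of a hierarchical, diffable Printer: each value goes on its
// own line, indented to reflect its depth in the data hierarchy.
public class Printer {
	private final StringBuilder out = new StringBuilder();
	private int indent = 0;

	void begin (String name) {
		line(name + " {");
		indent++;
	}

	void end () {
		indent--;
		line("}");
	}

	void field (String name, Object value) {
		line(name + ": " + value);
	}

	private void line (String text) {
		for (int i = 0; i < indent; i++) out.append("  ");
		out.append(text).append('\n');
	}

	public String result () {
		return out.toString();
	}

	public static void main (String[] args) {
		Printer p = new Printer();
		p.begin("SkeletonData");
		p.field("name", "spineboy-pro"); // hypothetical values
		p.begin("bone");
		p.field("name", "hip");
		p.end();
		p.end();
		System.out.print(p.result());
	}
}
```

Because every runtime emits the same line-per-value layout, a plain `diff` of two outputs pinpoints the first divergence.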
## Description

Create a comprehensive test suite that compares the output of all core Spine runtimes (Java, C++, C, TypeScript) to ensure consistency. The test suite will consist of headless command-line programs for each runtime that load skeleton data and output all internal state in a diffable text format. This will enable automated comparison testing between the reference implementation (spine-libgdx) and all ports. Each test program must compile directly against the runtime's source code, not pull published versions from package repositories.

The test programs will print both SkeletonData (setup pose/static data) and Skeleton (runtime pose after animation) states, including all nested types such as bones, slots, skins, attachments, constraints, and animations.
## Updated Implementation Plan (DebugPrinter in each runtime)

- [x] Remove test/ directory approach (deprecated)
- [x] Implement DebugPrinter in spine-libgdx-tests:
  - [x] Create DebugPrinter.java in spine-libgdx/spine-libgdx-tests/src
  - [x] Add command line argument parsing for skeleton, atlas, animation
  - [x] Create Printer class for outputting all types in hierarchical format
  - [x] Ensure it can run headless (no GL requirements)
  - [x] Merged Printer and HeadlessTextureLoader into single DebugPrinter.java file
  - [x] Added gradle task runDebugPrinter for easy execution
- [x] Implement DebugPrinter in spine-cpp:
  - [x] Create spine-cpp/tests directory with DebugPrinter.cpp
  - [x] Update spine-cpp/CMakeLists.txt with conditional target
  - [x] Add Printer class matching Java output format
  - [x] Ensure headless operation
- [x] Implement DebugPrinter in spine-c:
  - [x] Create spine-c/tests directory with debug-printer.c
  - [x] Update spine-c/CMakeLists.txt with conditional target
  - [x] Add printer functions matching output format
  - [x] Ensure headless operation
- [x] Fix spine-c/codegen/src/ir-generator.ts to use .buffer() for string getters (previously returned the address of a temporary String object)
  - Run generator via: npx tsx spine-c/codegen/src/index.ts
  - This regenerates spine-c/src/generated/*.cpp files
  - Fixed by handling "const String &" (with space) in addition to "const String&"
  - Verified: String methods now properly return .buffer() values (version, hash, etc. display correctly)
- [x] Issues found by DebugPrinter comparison:
  - [x] spine-c API missing file-based skeleton loader that sets name from filename
    - C++ has readSkeletonDataFile() which sets the name; spine-c only exposes a content-based loader
    - Result: SkeletonData name is empty in spine-c output
    - Fixed: Modified spine_skeleton_data_load_json/binary to accept a path parameter and extract the name
  - [x] Coordinate system inconsistency: Java shows scaleY=1.0, C/C++ show scaleY=-1.0; use Bone::setYDown(false) to match Java
- [x] Implement DebugPrinter in spine-ts/spine-core:
  - [x] Create tests/DebugPrinter.ts
  - [x] Update tsconfig.json to exclude tests/ so tests are not bundled
  - [x] Add Printer class matching output format
  - [x] Ensure it runs with npx tsx without a build step
- [x] Create test runner script (compare-with-reference-impl.ts):
  - [x] Run each runtime's DebugPrinter with the same inputs
  - [x] Compare outputs and report differences
  - [x] TypeScript script with shebang for direct execution
  - [x] Automatically builds C/C++/Java if needed
  - [x] Saves outputs to tests/output/ directory
  - [x] Shows line-by-line differences when outputs don't match
- [x] Make animation parameter optional in all DebugPrinters:
  - [x] If animation not provided, call skeleton.setToSetupPose() instead
  - [x] Update Java DebugPrinter
  - [x] Update C++ DebugPrinter
  - [x] Update C DebugPrinter
  - [x] Update TypeScript DebugPrinter
  - [x] Update compare-with-reference-impl.ts to handle optional animation
- [x] Fix locale issues - all DebugPrinters should use the English locale:
  - [x] Java: Set Locale.US for number formatting
  - [x] C++: Set locale to "C" or "en_US.UTF-8"
  - [x] C: Set locale to "C" or "en_US.UTF-8"
  - [x] TypeScript: Already uses a period for decimals
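The Java half of this fix boils down to never formatting floats with the default locale. A minimal sketch (the format string is illustrative, not the serializer's actual code):

```java
import java.util.Locale;

public class LocaleDemo {
	public static void main (String[] args) {
		float y = 0.016f;
		// With the default locale this may print "0,016000" (e.g. on de_DE),
		// which breaks cross-runtime diffing. Locale.US guarantees a period.
		String localeDependent = String.format("%.6f", y);
		String stable = String.format(Locale.US, "%.6f", y); // "0.016000"
		System.out.println(stable);
	}
}
```

The C and C++ equivalents set the process locale once at startup (`setlocale(LC_NUMERIC, "C")` / `std::locale::global`), so `printf`-family formatting behaves the same way.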
- [x] Improve buildCheck() to detect when a rebuild is needed:
  - [x] Check if the debug printer executable exists
  - [x] Compare the executable timestamp with source file timestamps
  - [x] Rebuild if any source files are newer than the executable
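The timestamp check can be sketched as follows; the function name is illustrative, not necessarily the actual `buildCheck()` code in the runner:

```typescript
import * as fs from 'fs';

// Returns true when the executable is missing or any source file is newer than it.
function needsRebuild(executable: string, sources: string[]): boolean {
	if (!fs.existsSync(executable)) return true;
	const exeTime = fs.statSync(executable).mtimeMs;
	return sources.some(src => fs.statSync(src).mtimeMs > exeTime);
}
```

Comparing against the newest source file keeps the runner fast: the expensive CMake/Gradle build only runs when something actually changed.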
- [x] Create tests/README.md documentation:
  - [x] Explain purpose: comparing the reference implementation to the other runtimes
  - [x] List DebugPrinter locations in each runtime
  - [x] Document how to run individual debug printers
  - [x] Document how to run compare-with-reference-impl.ts
- [x] Automated test: All DebugPrinters produce identical output
  - Note: Minor expected differences remain:
    - time field: Java shows 0.016 after update, C/C++ show 0.0
    - TypeScript: minor precision differences, null vs "" for audioPath
    - These are implementation details, not bugs
- [x] User test: Verify with multiple skeleton files
## Phase 2: JSON Serializers and HeadlessTest Rename

### Rename DebugPrinter to HeadlessTest

- [x] Rename all DebugPrinter files to HeadlessTest:
  - [x] Java: DebugPrinter.java → HeadlessTest.java
  - [x] C++: DebugPrinter.cpp → HeadlessTest.cpp
  - [x] C: debug-printer.c → headless-test.c
  - [x] TypeScript: DebugPrinter.ts → HeadlessTest.ts
- [x] Update VS Code launch configs to say "headless test ($runtime)":
  - [x] spine-libgdx/.vscode/launch.json
  - [x] spine-cpp/.vscode/launch.json
  - [x] spine-c/.vscode/launch.json
  - [x] spine-ts/.vscode/launch.json
- [x] Rename tests/compare-with-reference-impl.ts to headless-test-runner.ts
- [x] Update build files:
  - [x] CMakeLists.txt for C++ (executable name: headless-test)
  - [x] CMakeLists.txt for C (executable name: headless-test)
  - [x] Gradle for Java (task name: runHeadlessTest, main class: HeadlessTest)
- [x] Update tests/README.md with new names
### Implement JSON Serializers in Core Runtimes

- [x] Java (spine-libgdx):
  - [x] Create SkeletonSerializer class in com.esotericsoftware.spine.utils
  - [x] Implement serializeSkeletonData(SkeletonData, Writer/StringBuilder)
  - [x] Implement serializeSkeleton(Skeleton, Writer/StringBuilder)
  - [x] Implement serializeAnimationState(AnimationState, Writer/StringBuilder)
  - [x] Add depth/verbosity options to control output
  - [x] Handle circular references and limit nesting
  - [x] Update HeadlessTest to use SkeletonSerializer
  - [x] Review serializer with user:
    - [x] Test with an actual skeleton file to see the output format
    - [x] Add cycle detection to handle circular references (outputs "<circular>")
    - [x] Verify it compiles and produces JSON output
  - [x] Create comprehensive API analyzer tool:
    - [x] Analyzer discovers all types accessible via SkeletonData, Skeleton, and AnimationState
    - [x] For each type, enumerate all getters including inherited ones
    - [x] Generate the Java serializer from the analysis data
    - [x] Handle enums, abstract types, inner classes, and type parameters
    - [x] Filter out test classes and non-source files
  - [x] Work on SkeletonSerializer.java generation until it actually compiles.
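The cycle-detection item above can be sketched with an identity-based set of objects on the current serialization path. This is a minimal stand-in, not the generated SkeletonSerializer; the Node class is a hypothetical object graph with a back-reference, like Bone's parent/children:

```java
import java.util.IdentityHashMap;

public class CycleDemo {
	// Hypothetical graph node with a back-reference that creates a cycle.
	static class Node {
		String name;
		Node child;
		Node parent;

		Node (String name) {
			this.name = name;
		}
	}

	// Tracks objects on the current serialization path by identity, not equals().
	static final IdentityHashMap<Object, Boolean> active = new IdentityHashMap<>();

	static String serialize (Node node) {
		if (node == null) return "null";
		if (active.containsKey(node)) return "\"<circular>\""; // cycle: emit marker, stop recursing
		active.put(node, true);
		String json = "{\"name\": \"" + node.name + "\", \"parent\": " + serialize(node.parent)
			+ ", \"child\": " + serialize(node.child) + "}";
		active.remove(node); // only the current path counts, so shared (non-cyclic) references still serialize
		return json;
	}

	public static void main (String[] args) {
		Node root = new Node("root"), leaf = new Node("leaf");
		root.child = leaf;
		leaf.parent = root; // back-reference creates a cycle
		System.out.println(serialize(root));
	}
}
```

Removing the object from the active set on the way out is what distinguishes "cycle" from "same object referenced twice": only the former needs the `"<circular>"` marker.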
- [ ] C++ (spine-cpp):
  - [ ] Create SkeletonSerializer.h/cpp in spine-cpp/src/spine
  - [ ] Implement serializeSkeletonData(SkeletonData*, std::string&)
  - [ ] Implement serializeSkeleton(Skeleton*, std::string&)
  - [ ] Implement serializeAnimationState(AnimationState*, std::string&)
  - [ ] Add SerializerOptions struct for controlling output
  - [ ] Update HeadlessTest to use SkeletonSerializer
  - [ ] Ensure the serializer outputs exactly the same data format as the Java version
- [ ] C (spine-c):
  - [ ] Create spine-skeleton-serializer.h/c
  - [ ] Implement spine_skeleton_data_serialize_json(data, buffer, options)
  - [ ] Implement spine_skeleton_serialize_json(skeleton, buffer, options)
  - [ ] Implement spine_animation_state_serialize_json(state, buffer, options)
  - [ ] Add spine_serializer_options struct
  - [ ] Update headless-test to use the serializer functions
  - [ ] Ensure the serializer outputs exactly the same data format as the Java version
- [ ] TypeScript (spine-ts):
  - [ ] Create SkeletonSerializer.ts in spine-core/src
  - [ ] Implement serializeSkeletonData(data: SkeletonData): object
  - [ ] Implement serializeSkeleton(skeleton: Skeleton): object
  - [ ] Implement serializeAnimationState(state: AnimationState): object
  - [ ] Add SerializerOptions interface
  - [ ] Update HeadlessTest to use SkeletonSerializer and JSON.stringify
  - [ ] Ensure the serializer outputs exactly the same data format as the Java version
- [ ] Update tests/README.md to describe the new setup
### Misc (added by user while Claude worked, needs to be expanded!)

- [ ] HeadlessTest should probably
  - Have a mode that does what we currently do: take files and an animation name, and output the serialized skeleton data and skeleton. Used for ad-hoc testing of files submitted by users in error reports etc.
  - Have "unit" test-like tests that are easily extensible
    - each test has a name and points to the corresponding function to execute
    - HeadlessTest can take as args a single test name, multiple test names, or no args, in which case it runs all tests in order
    - Structure and CLI handling need to be the same in all HeadlessTest implementations
  - tests/headless-test-runner.ts should also support these same CLI args, run each runtime's tests, then compare outputs.
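The name-to-function registry described above could look like the following sketch. The test names and bodies are hypothetical, not the actual HeadlessTest tests:

```typescript
// Registry mapping test names to functions; run all in order, or only the named ones.
type TestFn = () => void;

const ran: string[] = []; // records execution order (stand-in for real test bodies)

const tests = new Map<string, TestFn>([
	['setup-pose', () => ran.push('setup-pose')],
	['walk-animation', () => ran.push('walk-animation')],
]);

function runTests(names: string[]): void {
	// No names given: run every registered test in declaration order.
	const selected = names.length > 0 ? names : [...tests.keys()];
	for (const name of selected) {
		const fn = tests.get(name);
		if (!fn) throw new Error(`Unknown test: ${name}`);
		fn();
	}
}

// CLI: `headless-test` runs all tests, `headless-test walk-animation ...` runs only those.
runTests(process.argv.slice(2));
```

Keeping the map and the argument handling identical across the Java, C++, C, and TypeScript implementations is what lets headless-test-runner.ts pass the same names to every runtime and diff the results.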
### Serializer Design Considerations

- Special cases to avoid infinite recursion:
  - Bone parent references (output name only)
  - Constraint target references (output names only)
  - Skin attachment references (limit depth)
  - Timeline references in animations
- Fields to include at each level:
  - SkeletonData: All top-level fields, list of bone/slot/skin/animation names
  - Skeleton: Current pose transforms, active skin, color
  - AnimationState: Active tracks, mix times, current time
- Output format: Pretty-printed JSON with 2-space indentation
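For illustration, output following these rules might look like the fragment below. The field values are hypothetical, not taken from a real skeleton:

```json
{
  "type": "SkeletonData",
  "name": "spineboy-pro",
  "bones": [
    {
      "type": "BoneData",
      "name": "hip",
      "parent": null,
      "scaleY": 1.000000
    }
  ]
}
```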
## Future Expansion (after serializers complete):

- Add full type printing for SkeletonData (bones, slots, skins, animations)
- Add Skeleton runtime state printing
- Add all attachment types
- Add all timeline types
- Add all constraint types
- Add comprehensive type verification
@@ -38,7 +38,9 @@ import com.badlogic.gdx.graphics.Texture;
 import com.badlogic.gdx.graphics.g2d.TextureAtlas;
 import com.badlogic.gdx.graphics.g2d.TextureAtlas.AtlasRegion;
 import com.badlogic.gdx.graphics.g2d.TextureAtlas.TextureAtlasData;
+import com.esotericsoftware.spine.utils.SkeletonSerializer;

+import java.io.StringWriter;
 import java.util.Locale;

 public class HeadlessTest implements ApplicationListener {
@@ -52,6 +54,8 @@ public class HeadlessTest implements ApplicationListener {
 		this.animationName = animationName;
 	}

+	// Removed Printer class - now using SkeletonSerializer
+	/*
 	static class Printer {
 		private int indentLevel = 0;
 		private static final String INDENT = " ";
@@ -124,6 +128,7 @@ public class HeadlessTest implements ApplicationListener {
 			print("}");
 		}
 	}
+	*/

 	// Mock texture that doesn't require OpenGL - similar to AndroidTexture
 	static class MockTexture extends Texture {
@@ -212,10 +217,14 @@ public class HeadlessTest implements ApplicationListener {
 				skeletonData = binary.readSkeletonData(skeletonFile);
 			}

-			// Print skeleton data
+			// Create serializer
+			SkeletonSerializer serializer = new SkeletonSerializer();
+
+			// Print skeleton data as JSON
 			System.out.println("=== SKELETON DATA ===");
-			Printer printer = new Printer();
-			printer.printSkeletonData(skeletonData);
+			StringWriter dataWriter = new StringWriter();
+			serializer.serializeSkeletonData(skeletonData, dataWriter);
+			System.out.println(dataWriter.toString());

 			// Create skeleton instance
 			Skeleton skeleton = new Skeleton(skeletonData);
@ -242,9 +251,17 @@ public class HeadlessTest implements ApplicationListener {
|
||||
|
||||
skeleton.updateWorldTransform(Physics.update);
|
||||
|
||||
// Print skeleton state
|
||||
// Print skeleton state as JSON
|
||||
System.out.println("\n=== SKELETON STATE ===");
|
||||
printer.printSkeleton(skeleton);
|
||||
StringWriter skeletonWriter = new StringWriter();
|
||||
serializer.serializeSkeleton(skeleton, skeletonWriter);
|
||||
System.out.println(skeletonWriter.toString());
|
||||
|
||||
// Print animation state as JSON
|
||||
System.out.println("\n=== ANIMATION STATE ===");
|
||||
StringWriter stateWriter = new StringWriter();
|
||||
serializer.serializeAnimationState(state, stateWriter);
|
||||
System.out.println(stateWriter.toString());
|
||||
|
||||
} catch (Exception e) {
|
||||
e.printStackTrace();
|
||||
|
||||
@@ -106,6 +106,10 @@ public class Animation {
 		this.duration = duration;
 	}

+	public IntArray getBones () {
+		return bones;
+	}
+
 	/** Applies the animation's timelines to the specified skeleton.
 	 * <p>
 	 * See Timeline {@link Timeline#apply(Skeleton, float, float, Array, float, MixBlend, MixDirection, boolean)}.
File diff suppressed because it is too large
@@ -25,6 +25,9 @@ Each runtime has a HeadlessTest program that outputs skeleton data in a standard
 ```bash
 cd spine-libgdx
 ./gradlew :spine-libgdx-tests:runHeadlessTest -Pargs="<skeleton-path> <atlas-path> [animation-name]"
+
+# Example with spineboy:
+./gradlew :spine-libgdx-tests:runHeadlessTest -Pargs="../examples/spineboy/export/spineboy-pro.json ../examples/spineboy/export/spineboy.atlas walk"
 ```

 ### C++ (spine-cpp)
@@ -32,6 +35,9 @@ cd spine-libgdx
 cd spine-cpp
 ./build.sh # Build if needed
 ./build/headless-test <skeleton-path> <atlas-path> [animation-name]
+
+# Example with spineboy:
+./build/headless-test ../examples/spineboy/export/spineboy-pro.json ../examples/spineboy/export/spineboy.atlas walk
 ```

 ### C (spine-c)
@@ -39,12 +45,18 @@ cd spine-cpp
 cd spine-c
 ./build.sh # Build if needed
 ./build/headless-test <skeleton-path> <atlas-path> [animation-name]
+
+# Example with spineboy:
+./build/headless-test ../examples/spineboy/export/spineboy-pro.json ../examples/spineboy/export/spineboy.atlas walk
 ```

 ### TypeScript (spine-ts)
 ```bash
 cd spine-ts/spine-core
 npx tsx tests/HeadlessTest.ts <skeleton-path> <atlas-path> [animation-name]
+
+# Example with spineboy:
+npx tsx tests/HeadlessTest.ts ../../examples/spineboy/export/spineboy-pro.json ../../examples/spineboy/export/spineboy.atlas walk
 ```

 ## Running the Comparison Test
@@ -82,12 +94,41 @@ This script will:
 Each HeadlessTest outputs:
 - **SKELETON DATA**: Static setup pose data (bones, slots, skins, animations metadata)
 - **SKELETON STATE**: Runtime state after applying animations
 - **ANIMATION STATE**: Current animation state with tracks and mixing information

-The output uses consistent formatting:
+The output uses consistent JSON formatting:
 - Hierarchical structure with 2-space indentation
 - Float values formatted to 6 decimal places
 - Strings quoted, nulls explicitly shown
 - Locale-independent number formatting (always uses `.` for decimals)
 - Circular references marked as `"<circular>"` to prevent infinite recursion
 - Each object includes a `"type"` field for easy identification

 ## Development Tools

 ### API Analyzer (Java)
 Analyzes the spine-libgdx API to discover all types and their properties:
 ```bash
 cd tests
 npx tsx analyze-java-api.ts
 # Output: output/analysis-result.json
 ```

 ### Serializer Generator (Java)
 Generates SkeletonSerializer.java from the analysis:
 ```bash
 cd tests
 npx tsx generate-java-serializer.ts
 # Output: ../spine-libgdx/spine-libgdx/src/com/esotericsoftware/spine/utils/SkeletonSerializer.java
 ```

 ### Claude Prompt Generator
 Generates a prompt for Claude to help port the serializer to other runtimes:
 ```bash
 cd tests
 npx tsx generate-claude-prompt.ts
 # Output: output/port-serializer-prompt.txt
 ```

 ## Troubleshooting
@@ -99,13 +140,3 @@ If outputs differ between runtimes:
 - Missing or extra fields in data structures
 - Different default values
 - Rounding differences
-
-## Future Expansion
-
-The current implementation prints basic skeleton data. Future expansions will include:
-- Full bone and slot hierarchies
-- All attachment types
-- Animation timelines
-- Constraint data
-- Physics settings
-- Complete runtime state after animation
665 tests/analyze-java-api.ts Executable file
@@ -0,0 +1,665 @@
#!/usr/bin/env tsx

import { execSync } from 'child_process';
import * as fs from 'fs';
import * as path from 'path';
import type { Symbol, LspOutput, ClassInfo, PropertyInfo, AnalysisResult } from './types';

function ensureOutputDir(): string {
	const outputDir = path.join(process.cwd(), 'output');
	if (!fs.existsSync(outputDir)) {
		fs.mkdirSync(outputDir, { recursive: true });
	}
	return outputDir;
}
function generateLspData(outputDir: string): string {
	const outputFile = path.join(outputDir, 'spine-libgdx-symbols.json');
	const projectDir = '/Users/badlogic/workspaces/spine-runtimes/spine-libgdx';
	const srcDir = path.join(projectDir, 'spine-libgdx/src');

	// Check if we need to regenerate
	let needsRegeneration = true;
	if (fs.existsSync(outputFile)) {
		const outputStats = fs.statSync(outputFile);
		const outputTime = outputStats.mtime.getTime();

		// Find the newest source file
		const newestSourceTime = execSync(
			`find "${srcDir}" -name "*.java" -type f ! -name "SkeletonSerializer.java" -exec stat -f "%m" {} \\; | sort -nr | head -1`,
			{ encoding: 'utf8' }
		).trim();

		if (newestSourceTime) {
			const sourceTime = parseInt(newestSourceTime) * 1000; // Convert to milliseconds
			needsRegeneration = sourceTime > outputTime;
		}
	}

	if (needsRegeneration) {
		console.error('Generating LSP data for spine-libgdx...');
		try {
			execSync(`npx lsp-cli "${projectDir}" java "${outputFile}"`, {
				encoding: 'utf8',
				stdio: ['ignore', 'ignore', 'pipe'] // Hide stdout but show stderr
			});
			console.error('LSP data generated successfully');
		} catch (error: any) {
			console.error('Error generating LSP data:', error.message);
			throw error;
		}
	} else {
		console.error('Using existing LSP data (up to date)');
	}

	return outputFile;
}
function analyzeClasses(symbols: Symbol[]): Map<string, ClassInfo> {
	const classMap = new Map<string, ClassInfo>();
	const srcPath = '/Users/badlogic/workspaces/spine-runtimes/spine-libgdx/spine-libgdx/src/';

	function processSymbol(symbol: Symbol, parentName?: string) {
		if (symbol.kind !== 'class' && symbol.kind !== 'enum' && symbol.kind !== 'interface') return;

		// Filter: only process symbols in spine-libgdx/src, excluding SkeletonSerializer
		if (!symbol.file.startsWith(srcPath)) return;
		if (symbol.file.endsWith('SkeletonSerializer.java')) return;

		const className = parentName ? `${parentName}.${symbol.name}` : symbol.name;

		const classInfo: ClassInfo = {
			className: className,
			superTypes: (symbol.supertypes || []).map(st => st.name.replace('$', '.')),
			superTypeDetails: symbol.supertypes,
			file: symbol.file,
			getters: [],
			fields: [],
			isAbstract: false,
			isInterface: symbol.kind === 'interface',
			isEnum: symbol.kind === 'enum',
			typeParameters: symbol.typeParameters || []
		};

		// No need to parse superTypes from preview anymore - lsp-cli handles this properly now

		// Check if abstract class
		if (symbol.preview && symbol.preview.includes('abstract ')) {
			classInfo.isAbstract = true;
		}

		// Log type parameter information if available
		if (symbol.typeParameters && symbol.typeParameters.length > 0) {
			console.error(`Class ${className} has type parameters: ${symbol.typeParameters.join(', ')}`);
		}
		if (symbol.supertypes) {
			for (const supertype of symbol.supertypes) {
				if (supertype.typeArguments && supertype.typeArguments.length > 0) {
					console.error(` extends ${supertype.name}<${supertype.typeArguments.join(', ')}>`);
				}
			}
		}

		// Find all getter methods, public fields, inner classes, and enum values
		if (symbol.children) {
			for (const child of symbol.children) {
				if (child.kind === 'class' || child.kind === 'enum' || child.kind === 'interface') {
					// Process inner class
					processSymbol(child, className);
				} else if (child.kind === 'enumMember') {
					// Collect enum values
					if (!classInfo.enumValues) {
						classInfo.enumValues = [];
					}
					classInfo.enumValues.push(child.name);
				} else if (child.kind === 'field' && child.preview) {
					// Check if it's a public field
					if (child.preview.includes('public ')) {
						// Extract field type from preview
						// Examples: "public float offset;", "public final Array<ToProperty> to = ..."
						const fieldMatch = child.preview.match(/public\s+(final\s+)?(.+?)\s+(\w+)\s*[;=]/);
						if (fieldMatch) {
							const isFinal = !!fieldMatch[1];
							const fieldType = fieldMatch[2].trim();
							const fieldName = fieldMatch[3];
							classInfo.fields.push({ fieldName, fieldType, isFinal });
						}
					}
				} else if (child.kind === 'method' &&
					child.name.startsWith('get') &&
					child.name !== 'getClass()' &&
					child.name.endsWith('()')) { // Only parameterless getters

					const methodName = child.name.slice(0, -2); // Remove ()

					if (methodName.length > 3 && methodName[3] === methodName[3].toUpperCase()) {
						// Extract return type from preview
						let returnType = 'unknown';
						if (child.preview) {
							const returnMatch = child.preview.match(/(?:public|protected|private)?\s*(.+?)\s+\w+\s*\(\s*\)/);
							if (returnMatch) {
								returnType = returnMatch[1].trim();
							}
						}

						classInfo.getters.push({ methodName, returnType });
					}
				}
			}
		}

		classMap.set(className, classInfo);
	}

	for (const symbol of symbols) {
		processSymbol(symbol);
	}

	return classMap;
}
function findAccessibleTypes(
	classMap: Map<string, ClassInfo>,
	startingTypes: string[]
): Set<string> {
	const accessible = new Set<string>();
	const toVisit = [...startingTypes];
	const visited = new Set<string>();

	// Helper to find all concrete subclasses of a type
	function findConcreteSubclasses(typeName: string, addToQueue: boolean = true): string[] {
		const concreteClasses: string[] = [];

		if (!classMap.has(typeName)) return concreteClasses;

		const classInfo = classMap.get(typeName)!;

		// Add the type itself if it's concrete
		if (!classInfo.isAbstract && !classInfo.isInterface && !classInfo.isEnum) {
			concreteClasses.push(typeName);
		}

		// Find all subclasses recursively
		for (const [className, info] of classMap) {
			// Check if this class extends our target (handle both qualified and unqualified names)
			const extendsTarget = info.superTypes.some(st =>
				st === typeName ||
				st === typeName.split('.').pop() ||
				(typeName.includes('.') && className.startsWith(typeName.split('.')[0] + '.') && st === typeName.split('.').pop())
			);

			if (extendsTarget) {
				// Recursively find concrete subclasses
				const subclasses = findConcreteSubclasses(className, false);
				concreteClasses.push(...subclasses);

				if (addToQueue && !visited.has(className)) {
					toVisit.push(className);
				}
			}
		}

		return concreteClasses;
	}

	while (toVisit.length > 0) {
		const typeName = toVisit.pop()!;

		if (visited.has(typeName)) continue;
		visited.add(typeName);

		if (!classMap.has(typeName)) {
			console.error(`Type ${typeName} not found in classMap`);
			continue;
		}

		const classInfo = classMap.get(typeName)!;

		// Add the type itself if it's concrete
		if (!classInfo.isAbstract && !classInfo.isInterface && !classInfo.isEnum) {
			accessible.add(typeName);
			console.error(`Added concrete type: ${typeName}`);
		}

		// Find all concrete subclasses of this type
		const concreteClasses = findConcreteSubclasses(typeName);
		concreteClasses.forEach(c => accessible.add(c));

		// Add types from getter return types and field types
		const allTypes = [
			...classInfo.getters.map(g => g.returnType),
			...classInfo.fields.map(f => f.fieldType)
		];

		for (const type of allTypes) {
			const returnType = type
				.replace(/@Null\s+/g, '') // Remove @Null annotations
				.replace(/\s+/g, ' '); // Normalize whitespace

			// Extract types from Array<Type>, IntArray, FloatArray, etc.
			const arrayMatch = returnType.match(/Array<(.+?)>/);
			if (arrayMatch) {
				const innerType = arrayMatch[1].trim();
				// Handle inner classes like AnimationState.TrackEntry
				if (innerType.includes('.')) {
					if (classMap.has(innerType) && !visited.has(innerType)) {
						toVisit.push(innerType);
					}
				} else {
					// Try both plain type and as inner class of current type
					if (classMap.has(innerType) && !visited.has(innerType)) {
						toVisit.push(innerType);
					}
					// Also try as inner class of the declaring type
					const parts = typeName.split('.');
					for (let i = parts.length; i >= 1; i--) {
						const parentPath = parts.slice(0, i).join('.');
						const innerClassPath = `${parentPath}.${innerType}`;
						if (classMap.has(innerClassPath) && !visited.has(innerClassPath)) {
							toVisit.push(innerClassPath);
							break;
						}
					}
				}
			}

			// Extract all capitalized type names
			const typeMatches = returnType.match(/\b([A-Z]\w+(?:\.[A-Z]\w+)*)\b/g);
			if (typeMatches) {
				for (const match of typeMatches) {
					if (match === 'BoneLocal') {
						console.error(`Found BoneLocal in return type of ${typeName}`);
					}
					if (classMap.has(match) && !visited.has(match)) {
						toVisit.push(match);
						if (match === 'BoneLocal') {
							console.error(`Added BoneLocal to toVisit`);
						}
					}
					// For non-qualified names, also try as inner class
					if (!match.includes('.')) {
						// Try as inner class of current type and its parents
						const parts = typeName.split('.');
						for (let i = parts.length; i >= 1; i--) {
							const parentPath = parts.slice(0, i).join('.');
							const innerClassPath = `${parentPath}.${match}`;
							if (classMap.has(innerClassPath) && !visited.has(innerClassPath)) {
								toVisit.push(innerClassPath);
								break;
							}
						}
					}
				}
			}
		}
	}

	console.error(`Found ${accessible.size} accessible types`);
	return accessible;
}
function getAllProperties(classMap: Map<string, ClassInfo>, className: string, symbolsFile: string): PropertyInfo[] {
	const allProperties: PropertyInfo[] = [];
	const visited = new Set<string>();
	const classInfo = classMap.get(className);
	if (!classInfo) return [];

	// Build type parameter mapping based on supertype details
	const typeParamMap = new Map<string, string>();

	// Helper to build parameter mappings for a specific supertype
	function buildTypeParamMapping (currentClass: string, targetSupertype: string): Map<string, string> {
		const mapping = new Map<string, string>();
		const currentInfo = classMap.get(currentClass);
		if (!currentInfo || !currentInfo.superTypeDetails) return mapping;

		// Find the matching supertype
		for (const supertype of currentInfo.superTypeDetails) {
			if (supertype.name === targetSupertype && supertype.typeArguments) {
				// Get the supertype's class info to know its type parameters
				const supertypeInfo = classMap.get(targetSupertype);
				if (supertypeInfo && supertypeInfo.typeParameters) {
					// Map type parameters to arguments
					for (let i = 0; i < Math.min(supertypeInfo.typeParameters.length, supertype.typeArguments.length); i++) {
						mapping.set(supertypeInfo.typeParameters[i], supertype.typeArguments[i]);
					}
				}
				break;
			}
		}
		return mapping;
	}

	function resolveType (type: string, typeMap: Map<string, string> = new Map()): string {
		// Resolve generic type parameters
		if (typeMap.has(type)) {
			return typeMap.get(type)!;
		}
		// TODO: Handle complex types like Array<T>, Map<K, V>, etc.
		return type;
	}

	// Collect properties in inheritance order (most specific first)
	function collectProperties (currentClass: string, inheritanceLevel: number = 0, currentTypeMap: Map<string, string> = new Map()) {
		if (visited.has(currentClass)) return;
		visited.add(currentClass);

		const classInfo = classMap.get(currentClass);
		if (!classInfo) return;

		// Add this class's getters with resolved types
		for (const getter of classInfo.getters) {
			allProperties.push({
				name: getter.methodName + '()',
				type: resolveType(getter.returnType, currentTypeMap),
				isGetter: true,
				inheritedFrom: inheritanceLevel === 0 ? undefined : currentClass
			});
		}

		// Add this class's public fields
		for (const field of classInfo.fields) {
			allProperties.push({
				name: field.fieldName,
				type: resolveType(field.fieldType, currentTypeMap),
				isGetter: false,
				inheritedFrom: inheritanceLevel === 0 ? undefined : currentClass
			});
		}

		// Recursively collect from supertypes
		for (const superType of classInfo.superTypes) {
			// Build type parameter mapping for this supertype
			const supertypeMapping = buildTypeParamMapping(currentClass, superType);

			// Compose mappings - resolve type arguments through current mapping
			const composedMapping = new Map<string, string>();
			for (const [param, arg] of supertypeMapping) {
				composedMapping.set(param, resolveType(arg, currentTypeMap));
			}

			// Try to find the supertype - it might be unqualified
			let superClassInfo = classMap.get(superType);

			// If not found and it's unqualified, try to find it as an inner class
			if (!superClassInfo && !superType.includes('.')) {
				// Try as inner class of the same parent
				if (currentClass.includes('.')) {
					const parentPrefix = currentClass.substring(0, currentClass.lastIndexOf('.'));
					const qualifiedSuper = `${parentPrefix}.${superType}`;
					superClassInfo = classMap.get(qualifiedSuper);
					if (superClassInfo) {
						collectProperties(qualifiedSuper, inheritanceLevel + 1, composedMapping);
						continue;
					}
				}

				// Try as top-level class
				for (const [name, info] of classMap) {
					if (name === superType || name.endsWith(`.${superType}`)) {
						collectProperties(name, inheritanceLevel + 1, composedMapping);
						break;
					}
				}
			} else if (superClassInfo) {
				collectProperties(superType, inheritanceLevel + 1, composedMapping);
			}
		}
	}

	collectProperties(className);

	// Remove duplicates (overridden methods/shadowed fields), keeping the most specific one
	const seen = new Map<string, PropertyInfo>();
	for (const prop of allProperties) {
		const key = prop.isGetter ? prop.name : `field:${prop.name}`;
		if (!seen.has(key)) {
			seen.set(key, prop);
		}
	}

	return Array.from(seen.values());
}

// Helper to find all implementations of a type (both concrete and abstract)
function findAllImplementations(classMap: Map<string, ClassInfo>, typeName: string, concreteOnly: boolean = false): string[] {
	const implementations: string[] = [];
	const visited = new Set<string>();

	function findImplementations (currentType: string) {
		if (visited.has(currentType)) return;
		visited.add(currentType);

		// Get the short name for comparison
		const currentShortName = currentType.split('.').pop()!;
		const currentPrefix = currentType.includes('.') ? currentType.split('.')[0] : '';

		for (const [className, classInfo] of classMap) {
			// Check if this class extends/implements the current type
			let extendsType = false;

			// For inner classes, we need to check if they're in the same outer class
			if (currentPrefix && className.startsWith(currentPrefix + '.')) {
				// Both are inner classes of the same outer class
				extendsType = classInfo.superTypes.some(st =>
					st === currentShortName || st === currentType
				);
			} else {
				// Standard inheritance check
				extendsType = classInfo.superTypes.some(st =>
					st === currentType || st === currentShortName
				);
			}

			if (extendsType) {
				if (!classInfo.isAbstract && !classInfo.isInterface && !classInfo.isEnum) {
					// This is a concrete implementation
					implementations.push(className);
				} else {
					// This is abstract/interface
					if (!concreteOnly) {
						// Include abstract types when getting all implementations
						implementations.push(className);
					}
					// Always recurse to find further implementations
					findImplementations(className);
				}
			}
		}
	}

	findImplementations(typeName);
	return [...new Set(implementations)].sort(); // Remove duplicates and sort
}

function analyzeForSerialization(classMap: Map<string, ClassInfo>, symbolsFile: string): AnalysisResult {
	const startingTypes = ['SkeletonData', 'Skeleton', 'AnimationState'];
	const accessibleTypes = findAccessibleTypes(classMap, startingTypes);

	// First pass: populate implementations for all abstract types
	for (const [className, classInfo] of classMap) {
		if (classInfo.isAbstract || classInfo.isInterface) {
			// Get only concrete implementations
			const concreteImplementations = findAllImplementations(classMap, className, true);
			classInfo.concreteImplementations = concreteImplementations;

			// Get all implementations (including intermediate abstract types)
			const allImplementations = findAllImplementations(classMap, className, false);
			classInfo.allImplementations = allImplementations;
		}
	}

	// Collect abstract types and their implementations
	const abstractTypes = new Map<string, string[]>();
	const allTypesToGenerate = new Set<string>(accessibleTypes);

	// Find all abstract types referenced by accessible types
	for (const typeName of accessibleTypes) {
		const classInfo = classMap.get(typeName);
		if (!classInfo) continue;

		// Check return types and field types for abstract classes
		const allTypes = [
			...classInfo.getters.map(g => g.returnType),
			...classInfo.fields.map(f => f.fieldType)
		];

		for (const type of allTypes) {
			const returnType = type
				.replace(/@Null\s+/g, '')
				.replace(/\s+/g, ' ');

			// Extract types from Array<Type>
			let checkTypes: string[] = [];
			const arrayMatch = returnType.match(/Array<(.+?)>/);
			if (arrayMatch) {
				checkTypes.push(arrayMatch[1].trim());
			} else if (returnType.match(/^[A-Z]\w+$/)) {
				checkTypes.push(returnType);
			}

			// Also check for type names that might be inner classes
			const typeMatches = returnType.match(/\b([A-Z]\w+)\b/g);
			if (typeMatches) {
				for (const match of typeMatches) {
					// Try as inner class of current type
					const parts = typeName.split('.');
					for (let i = parts.length; i >= 1; i--) {
						const parentPath = parts.slice(0, i).join('.');
						const innerClassPath = `${parentPath}.${match}`;
						if (classMap.has(innerClassPath)) {
							checkTypes.push(innerClassPath);
							break;
						}
					}
				}
			}

			for (const checkType of checkTypes) {
				if (checkType && classMap.has(checkType)) {
					const typeInfo = classMap.get(checkType)!;
					if (typeInfo.isAbstract || typeInfo.isInterface) {
						// Use the already populated concreteImplementations
						const implementations = typeInfo.concreteImplementations || [];
						abstractTypes.set(checkType, implementations);

						// Add all concrete implementations to types to generate
						implementations.forEach(impl => allTypesToGenerate.add(impl));
					}
				}
			}
		}
	}

	// Collect all properties for each type (including inherited ones)
	const typeProperties = new Map<string, PropertyInfo[]>();
	for (const typeName of allTypesToGenerate) {
		const props = getAllProperties(classMap, typeName, symbolsFile);
		typeProperties.set(typeName, props);
	}

	// Also collect properties for abstract types (so we know what properties their implementations should have)
	for (const abstractType of abstractTypes.keys()) {
		if (!typeProperties.has(abstractType)) {
			const props = getAllProperties(classMap, abstractType, symbolsFile);
			typeProperties.set(abstractType, props);
		}
	}

	// Second pass: find additional concrete types referenced in properties
	const additionalTypes = new Set<string>();
	for (const [typeName, props] of typeProperties) {
		for (const prop of props) {
			const propType = prop.type.replace(/@Null\s+/g, '').trim();

			// Check if it's a simple type name
			const typeMatch = propType.match(/^([A-Z]\w+)$/);
			if (typeMatch) {
				const type = typeMatch[1];
				if (classMap.has(type)) {
					const typeInfo = classMap.get(type)!;
					if (!typeInfo.isAbstract && !typeInfo.isInterface && !typeInfo.isEnum) {
						if (!allTypesToGenerate.has(type)) {
							additionalTypes.add(type);
							console.error(`Found additional type ${type} from property ${prop.name} of ${typeName}`);
						}
					}
				}
			}
		}
	}

	// Add the additional types
	additionalTypes.forEach(type => allTypesToGenerate.add(type));

	// Get properties for the additional types too
	for (const typeName of additionalTypes) {
		const props = getAllProperties(classMap, typeName, symbolsFile);
		typeProperties.set(typeName, props);
	}

	return {
		classMap,
		accessibleTypes,
		abstractTypes,
		allTypesToGenerate,
		typeProperties
	};
}

async function main() {
	try {
		// Ensure output directory exists
		const outputDir = ensureOutputDir();

		// Generate LSP data
		const jsonFile = generateLspData(outputDir);

		// Read and parse the JSON
		const jsonContent = fs.readFileSync(jsonFile, 'utf8');
		const lspData: LspOutput = JSON.parse(jsonContent);

		console.error(`Analyzing ${lspData.symbols.length} symbols...`);

		// Analyze all classes
		const classMap = analyzeClasses(lspData.symbols);
		console.error(`Found ${classMap.size} classes`);

		// Perform serialization analysis
		const analysisResult = analyzeForSerialization(classMap, jsonFile);
		console.error(`Found ${analysisResult.accessibleTypes.size} accessible types`);
		console.error(`Found ${analysisResult.allTypesToGenerate.size} types to generate`);

		// Save analysis result to file
		const analysisFile = path.join(outputDir, 'analysis-result.json');

		// Convert Maps to arrays and handle nested Maps in ClassInfo
		const classMapArray: [string, any][] = [];
		for (const [name, info] of analysisResult.classMap) {
			const serializedInfo = {
				...info,
				typeParameters: info.typeParameters ? Array.from(info.typeParameters.entries()) : undefined
			};
			classMapArray.push([name, serializedInfo]);
		}

		const resultToSave = {
			...analysisResult,
			// Convert Maps and Sets to arrays for JSON serialization
			classMap: classMapArray,
			accessibleTypes: Array.from(analysisResult.accessibleTypes),
			abstractTypes: Array.from(analysisResult.abstractTypes.entries()),
			allTypesToGenerate: Array.from(analysisResult.allTypesToGenerate),
			typeProperties: Array.from(analysisResult.typeProperties.entries())
		};

		fs.writeFileSync(analysisFile, JSON.stringify(resultToSave, null, 2));
		console.log(`Analysis result written to: ${analysisFile}`);

	} catch (error: any) {
		console.error('Error:', error.message);
		process.exit(1);
	}
}

main();
165  tests/generate-claude-prompt.ts  Normal file
@@ -0,0 +1,165 @@
#!/usr/bin/env tsx

import * as fs from 'fs';
import * as path from 'path';
import type { ClassInfo, PropertyInfo } from './types';

interface SerializedAnalysisResult {
	classMap: [string, ClassInfo][];
	accessibleTypes: string[];
	abstractTypes: [string, string[]][];
	allTypesToGenerate: string[];
	typeProperties: [string, PropertyInfo[]][];
}

function generateClaudePrompt(analysisData: SerializedAnalysisResult): string {
	const output: string[] = [];

	// Convert arrays back to Maps
	const classMap = new Map(analysisData.classMap);
	const abstractTypes = new Map(analysisData.abstractTypes);
	const typeProperties = new Map(analysisData.typeProperties);

	output.push('# Spine Java Serialization Methods to Generate');
	output.push('');
	output.push('You need to generate writeXXX() methods for the SkeletonSerializer class.');
	output.push('Each method should serialize all properties accessible via getter methods.');
	output.push('');
	output.push('## Task: Port the Java SkeletonSerializer to Other Runtimes');
	output.push('');
	output.push('A complete Java SkeletonSerializer has been generated at:');
	output.push('`spine-libgdx/spine-libgdx/src/com/esotericsoftware/spine/utils/SkeletonSerializer.java`');
	output.push('');
	output.push('Port this serializer to:');
	output.push('- **C++**: Create `SkeletonSerializer.h/cpp` in `spine-cpp/src/spine/`');
	output.push('- **C**: Create `spine_skeleton_serializer.h/c` in `spine-c/src/spine/`');
	output.push('- **TypeScript**: Create `SkeletonSerializer.ts` in `spine-ts/spine-core/src/`');
	output.push('');
	output.push('## Important Porting Notes');
	output.push('');
	output.push('1. **Language Differences**:');
	output.push('   - Java uses getter methods (`getName()`), TypeScript may use properties (`.name`)');
	output.push('   - C uses function calls (`spBoneData_getName()`)');
	output.push('   - Adapt to each language\'s idioms');
	output.push('2. **Type Checking**:');
	output.push('   - Java uses `instanceof`');
	output.push('   - C++ uses custom RTTI (`object->getRTTI().instanceOf()`)');
	output.push('   - C uses type fields or function pointers');
	output.push('   - TypeScript uses `instanceof`');
	output.push('3. **Collections**:');
	output.push('   - Java uses `Array<T>`, `IntArray`, `FloatArray`');
	output.push('   - C++ uses `Vector<T>`');
	output.push('   - C uses arrays with size fields');
	output.push('   - TypeScript uses arrays `T[]`');
	output.push('');
	output.push('## Types Reference');
	output.push('');

	// First emit abstract types that need instanceof delegation
	for (const [className, classInfo] of classMap) {
		if ((classInfo.isAbstract || classInfo.isInterface) && classInfo.concreteImplementations && classInfo.concreteImplementations.length > 0) {
			output.push(`### ${className} (${classInfo.isInterface ? 'interface' : 'abstract'})`);
			if (classInfo.superTypes.length > 0) {
				output.push(`Extends: ${classInfo.superTypes.join(', ')}`);
			}
			output.push('');
			output.push('This is an abstract class. Generate a write' + className.split('.').pop() + '() method that checks instanceof for these concrete implementations:');
			for (const impl of classInfo.concreteImplementations.sort()) {
				output.push(`- ${impl}`);
			}
			output.push('');
			output.push('Example implementation:');
			output.push('```java');
			output.push(`private void write${className.split('.').pop()}(JsonWriter json, ${className.split('.').pop()} obj) throws IOException {`);
			const first = classInfo.concreteImplementations[0];
			if (first) {
				const shortName = first.split('.').pop()!;
				output.push(`  if (obj instanceof ${shortName}) {`);
				output.push(`    write${shortName}(json, (${shortName}) obj);`);
				output.push('  } // ... etc for all concrete types');
				output.push('  else {');
				output.push(`    throw new RuntimeException("Unknown ${className.split('.').pop()} type: " + obj.getClass().getName());`);
				output.push('  }');
			}
			output.push('}');
			output.push('```');
			output.push('');
		}
	}

	// Then emit concrete types
	const sortedTypes = Array.from(analysisData.allTypesToGenerate).sort();

	for (const typeName of sortedTypes) {
		const classInfo = classMap.get(typeName)!;

		output.push(`### ${typeName}`);
		if (classInfo && classInfo.superTypes.length > 0) {
			output.push(`Extends: ${classInfo.superTypes.join(', ')}`);
		}
		output.push('');

		const properties = typeProperties.get(typeName) || [];
		const getters = properties.filter(p => p.isGetter);
		const fields = properties.filter(p => !p.isGetter);

		if (getters.length > 0 || fields.length > 0) {
			if (fields.length > 0) {
				output.push('Public fields:');
				output.push('```java');
				for (const field of fields) {
					output.push(`${field.name} // ${field.type}`);
				}
				output.push('```');
				output.push('');
			}

			if (getters.length > 0) {
				output.push('Getters to serialize:');
				output.push('```java');
				for (const getter of getters) {
					output.push(`${getter.name} // returns ${getter.type}`);
				}
				output.push('```');
			}
		} else {
			output.push('*No properties found*');
		}
		output.push('');
	}

	return output.join('\n');
}

async function main() {
	try {
		// Read analysis result
		const analysisFile = path.join(process.cwd(), 'output', 'analysis-result.json');
		if (!fs.existsSync(analysisFile)) {
			console.error('Analysis result not found. Run analyze-java-api.ts first.');
			process.exit(1);
		}

		const analysisData: SerializedAnalysisResult = JSON.parse(fs.readFileSync(analysisFile, 'utf8'));

		// Generate Claude prompt
		const prompt = generateClaudePrompt(analysisData);

		// Write the prompt file
		const outputFile = path.join(process.cwd(), 'output', 'port-serializer-to-other-runtimes.md');
		fs.writeFileSync(outputFile, prompt);

		console.log(`Claude prompt written to: ${outputFile}`);

	} catch (error: any) {
		console.error('Error:', error.message);
		process.exit(1);
	}
}

// Allow running as a script or importing the function
if (import.meta.url === `file://${process.argv[1]}`) {
	main();
}

export { generateClaudePrompt };
549  tests/generate-java-serializer.ts  Normal file
@@ -0,0 +1,549 @@
#!/usr/bin/env tsx
|
||||
|
||||
import * as fs from 'fs';
|
||||
import * as path from 'path';
|
||||
import type { ClassInfo, PropertyInfo } from './types';
|
||||
|
||||
interface SerializedAnalysisResult {
|
||||
classMap: [string, ClassInfo][];
|
||||
accessibleTypes: string[];
|
||||
abstractTypes: [string, string[]][];
|
||||
allTypesToGenerate: string[];
|
||||
typeProperties: [string, PropertyInfo[]][];
|
||||
}
|
||||
|
||||
function generateWriteValue(output: string[], expression: string, type: string, indent: string, abstractTypes: Map<string, string[]>, classMap: Map<string, ClassInfo>) {
|
||||
// Handle null annotations
|
||||
const isNullable = type.includes('@Null');
|
||||
type = type.replace(/@Null\s+/g, '').trim();
|
||||
|
||||
// Primitive types
|
||||
if (['String', 'int', 'float', 'boolean', 'short', 'byte', 'double', 'long'].includes(type)) {
|
||||
output.push(`${indent}json.writeValue(${expression});`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Check if it's an enum - need to handle both short and full names
|
||||
let classInfo = classMap.get(type);
|
||||
if (!classInfo && !type.includes('.')) {
|
||||
// Try to find by short name
|
||||
for (const [fullName, info] of classMap) {
|
||||
if (fullName.split('.').pop() === type) {
|
||||
classInfo = info;
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (classInfo?.isEnum) {
|
||||
if (isNullable) {
|
||||
output.push(`${indent}if (${expression} == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent}} else {`);
|
||||
output.push(`${indent} json.writeValue(${expression}.name());`);
|
||||
output.push(`${indent}}`);
|
||||
} else {
|
||||
output.push(`${indent}json.writeValue(${expression}.name());`);
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Arrays
|
||||
if (type.startsWith('Array<')) {
|
||||
const innerType = type.match(/Array<(.+?)>/)![1].trim();
|
||||
output.push(`${indent}if (${expression} == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent}} else {`);
|
||||
output.push(`${indent} json.writeArrayStart();`);
|
||||
output.push(`${indent} for (${innerType} item : ${expression}) {`);
|
||||
generateWriteValue(output, 'item', innerType, indent + ' ', abstractTypes, classMap);
|
||||
output.push(`${indent} }`);
|
||||
output.push(`${indent} json.writeArrayEnd();`);
|
||||
output.push(`${indent}}`);
|
||||
return;
|
||||
}
|
||||
|
||||
if (type === 'IntArray' || type === 'FloatArray') {
|
||||
output.push(`${indent}if (${expression} == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent}} else {`);
|
||||
output.push(`${indent} json.writeArrayStart();`);
|
||||
output.push(`${indent} for (int i = 0; i < ${expression}.size; i++) {`);
|
||||
output.push(`${indent} json.writeValue(${expression}.get(i));`);
|
||||
output.push(`${indent} }`);
|
||||
output.push(`${indent} json.writeArrayEnd();`);
|
||||
output.push(`${indent}}`);
|
||||
return;
|
||||
}
|
||||
|
||||
if (type.endsWith('[]')) {
|
||||
const elemType = type.slice(0, -2);
|
||||
output.push(`${indent}if (${expression} == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent}} else {`);
|
||||
output.push(`${indent} json.writeArrayStart();`);
|
||||
// Handle nested arrays (like float[][])
|
||||
if (elemType.endsWith('[]')) {
|
||||
const nestedType = elemType.slice(0, -2);
|
||||
output.push(`${indent} for (${elemType} nestedArray : ${expression}) {`);
|
||||
output.push(`${indent} if (nestedArray == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent} } else {`);
|
||||
output.push(`${indent} json.writeArrayStart();`);
|
||||
output.push(`${indent} for (${nestedType} elem : nestedArray) {`);
|
||||
output.push(`${indent} json.writeValue(elem);`);
|
||||
output.push(`${indent} }`);
|
||||
output.push(`${indent} json.writeArrayEnd();`);
|
||||
output.push(`${indent} }`);
|
||||
output.push(`${indent} }`);
|
||||
} else {
|
||||
output.push(`${indent} for (${elemType} item : ${expression}) {`);
|
||||
generateWriteValue(output, 'item', elemType, indent + ' ', abstractTypes, classMap);
|
||||
output.push(`${indent} }`);
|
||||
}
|
||||
output.push(`${indent} json.writeArrayEnd();`);
|
||||
output.push(`${indent}}`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Special cases for libGDX types
|
||||
if (type === 'Color') {
|
||||
output.push(`${indent}writeColor(json, ${expression});`);
|
||||
return;
|
||||
}
|
||||
|
||||
if (type === 'TextureRegion') {
|
||||
output.push(`${indent}writeTextureRegion(json, ${expression});`);
|
||||
return;
|
||||
}
|
||||
|
||||
// Handle objects
|
||||
const shortType = type.split('.').pop()!;
|
||||
|
||||
// Check if this type exists in classMap (for abstract types that might not be in generated methods)
|
||||
let foundInClassMap = classMap.has(type);
|
||||
if (!foundInClassMap && !type.includes('.')) {
|
||||
// Try to find by short name
|
||||
for (const [fullName, info] of classMap) {
|
||||
if (fullName.split('.').pop() === type) {
|
||||
foundInClassMap = true;
|
||||
// If it's abstract/interface, we need the instanceof chain
|
||||
if (info.isAbstract || info.isInterface) {
|
||||
type = fullName; // Use full name for abstract types
|
||||
}
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (isNullable) {
|
||||
output.push(`${indent}if (${expression} == null) {`);
|
||||
output.push(`${indent} json.writeNull();`);
|
||||
output.push(`${indent}} else {`);
|
||||
output.push(`${indent} write${shortType}(json, ${expression});`);
|
||||
output.push(`${indent}}`);
|
||||
} else {
|
||||
output.push(`${indent}write${shortType}(json, ${expression});`);
|
||||
}
|
||||
}
|
||||
|
||||
function generateJavaSerializer(analysisData: SerializedAnalysisResult): string {
|
||||
const javaOutput: string[] = [];
|
||||
|
||||
// Convert arrays back to Maps
|
||||
const classMap = new Map(analysisData.classMap);
|
||||
const abstractTypes = new Map(analysisData.abstractTypes);
|
||||
const typeProperties = new Map(analysisData.typeProperties);
|
||||
|
||||
// Collect all types that need write methods
|
||||
const typesNeedingMethods = new Set<string>();
|
||||
|
||||
// Add all types from allTypesToGenerate
|
||||
for (const type of analysisData.allTypesToGenerate) {
|
||||
typesNeedingMethods.add(type);
|
||||
}
|
||||
|
||||
// Add all abstract types that are referenced
|
||||
for (const [abstractType] of abstractTypes) {
|
||||
typesNeedingMethods.add(abstractType);
|
||||
}
|
||||
|
||||
// Add types referenced in properties
|
||||
for (const [typeName, props] of typeProperties) {
|
||||
if (!typesNeedingMethods.has(typeName)) continue;
|
||||
|
||||
for (const prop of props) {
|
||||
let propType = prop.type.replace(/@Null\s+/g, '').trim();
|
||||
|
||||
// Extract type from Array<Type>
|
||||
const arrayMatch = propType.match(/Array<(.+?)>/);
|
||||
if (arrayMatch) {
|
||||
propType = arrayMatch[1].trim();
|
||||
}
|
||||
|
||||
// Extract type from Type[]
|
||||
if (propType.endsWith('[]')) {
|
||||
propType = propType.slice(0, -2);
|
||||
}
|
||||
|
||||
// Skip primitives and special types
|
||||
if (['String', 'int', 'float', 'boolean', 'short', 'byte', 'double', 'long',
|
||||
'Color', 'TextureRegion', 'IntArray', 'FloatArray'].includes(propType)) {
|
||||
continue;
|
||||
}
|
||||
|
||||
// Add the type if it's a class
|
||||
if (propType.match(/^[A-Z]/)) {
|
||||
typesNeedingMethods.add(propType);
|
||||
|
||||
// Also check if it's an abstract type in classMap
|
||||
let found = false;
|
||||
for (const [fullName, info] of classMap) {
|
||||
if (fullName === propType || fullName.split('.').pop() === propType) {
|
||||
if (info.isAbstract || info.isInterface) {
|
||||
typesNeedingMethods.add(fullName);
|
||||
}
|
||||
found = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Generate Java file header
|
||||
javaOutput.push('package com.esotericsoftware.spine.utils;');
|
||||
javaOutput.push('');
|
||||
javaOutput.push('import com.esotericsoftware.spine.*;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.Animation.*;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.AnimationState.*;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.BoneData.Inherit;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.Skin.SkinEntry;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.PathConstraintData.*;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.TransformConstraintData.*;');
|
||||
javaOutput.push('import com.esotericsoftware.spine.attachments.*;');
|
||||
javaOutput.push('import com.badlogic.gdx.graphics.Color;');
|
||||
javaOutput.push('import com.badlogic.gdx.graphics.g2d.TextureRegion;');
|
||||
javaOutput.push('import com.badlogic.gdx.utils.Array;');
|
||||
javaOutput.push('import com.badlogic.gdx.utils.IntArray;');
|
||||
javaOutput.push('import com.badlogic.gdx.utils.FloatArray;');
|
||||
javaOutput.push('');
|
||||
javaOutput.push('import java.io.Writer;');
|
||||
javaOutput.push('import java.io.IOException;');
|
||||
javaOutput.push('import java.util.Locale;');
|
||||
javaOutput.push('import java.util.Set;');
|
||||
javaOutput.push('import java.util.HashSet;');
|
||||
javaOutput.push('');
|
||||
javaOutput.push('public class SkeletonSerializer {');
|
||||
javaOutput.push(' private final Set<Object> visitedObjects = new HashSet<>();');
|
||||
javaOutput.push('');
|
||||
|
||||
	// Generate main entry methods
	javaOutput.push('	public void serializeSkeletonData(SkeletonData data, Writer writer) throws IOException {');
	javaOutput.push('		visitedObjects.clear();');
	javaOutput.push('		JsonWriter json = new JsonWriter(writer);');
	javaOutput.push('		writeSkeletonData(json, data);');
	javaOutput.push('		json.close();');
	javaOutput.push('	}');
	javaOutput.push('');
	javaOutput.push('	public void serializeSkeleton(Skeleton skeleton, Writer writer) throws IOException {');
	javaOutput.push('		visitedObjects.clear();');
	javaOutput.push('		JsonWriter json = new JsonWriter(writer);');
	javaOutput.push('		writeSkeleton(json, skeleton);');
	javaOutput.push('		json.close();');
	javaOutput.push('	}');
	javaOutput.push('');
	javaOutput.push('	public void serializeAnimationState(AnimationState state, Writer writer) throws IOException {');
	javaOutput.push('		visitedObjects.clear();');
	javaOutput.push('		JsonWriter json = new JsonWriter(writer);');
	javaOutput.push('		writeAnimationState(json, state);');
	javaOutput.push('		json.close();');
	javaOutput.push('	}');
	javaOutput.push('');
	// Generate write methods for all types
	const generatedMethods = new Set<string>();

	for (const typeName of Array.from(typesNeedingMethods).sort()) {
		const classInfo = classMap.get(typeName);
		if (!classInfo) continue;

		const shortName = typeName.split('.').pop()!;

		// Skip if already generated (handle name collisions)
		if (generatedMethods.has(shortName)) continue;
		generatedMethods.add(shortName);

		// Use full class name for inner classes
		const className = typeName.includes('.') ? typeName : shortName;

		javaOutput.push(`	private void write${shortName}(JsonWriter json, ${className} obj) throws IOException {`);

		if (classInfo.isEnum) {
			// Handle enums
			javaOutput.push('		json.writeValue(obj.name());');
		} else if (classInfo.isAbstract || classInfo.isInterface) {
			// Handle abstract types with an instanceof chain
			const implementations = classInfo.concreteImplementations || [];
			if (implementations.length === 0) {
				javaOutput.push('		json.writeNull(); // No concrete implementations');
			} else {
				let first = true;
				for (const impl of implementations) {
					const implShortName = impl.split('.').pop()!;
					const implClassName = impl.includes('.') ? impl : implShortName;

					if (first) {
						javaOutput.push(`		if (obj instanceof ${implClassName}) {`);
						first = false;
					} else {
						javaOutput.push(`		} else if (obj instanceof ${implClassName}) {`);
					}
					javaOutput.push(`			write${implShortName}(json, (${implClassName}) obj);`);
				}
				javaOutput.push('		} else {');
				javaOutput.push(`			throw new RuntimeException("Unknown ${shortName} type: " + obj.getClass().getName());`);
				javaOutput.push('		}');
			}
		} else {
			// Handle concrete types
			const properties = typeProperties.get(typeName) || [];

			// Add cycle detection
			javaOutput.push('		if (visitedObjects.contains(obj)) {');
			javaOutput.push('			json.writeValue("<circular>");');
			javaOutput.push('			return;');
			javaOutput.push('		}');
			javaOutput.push('		visitedObjects.add(obj);');
			javaOutput.push('');

			javaOutput.push('		json.writeObjectStart();');

			// Write type field
			javaOutput.push('		json.writeName("type");');
			javaOutput.push(`		json.writeValue("${shortName}");`);

			// Write properties
			for (const prop of properties) {
				// Derive the JSON name: "getScaleX()" -> "scaleX"; fields keep their name.
				const propName = prop.isGetter
					? prop.name.replace('get', '').replace('()', '').charAt(0).toLowerCase() +
						prop.name.replace('get', '').replace('()', '').slice(1)
					: prop.name;

				javaOutput.push('');
				javaOutput.push(`		json.writeName("${propName}");`);
				// Getter names already include "()", so the same accessor works for getters and fields.
				const accessor = `obj.${prop.name}`;
				generateWriteValue(javaOutput, accessor, prop.type, '		', abstractTypes, classMap);
			}

			javaOutput.push('');
			javaOutput.push('		json.writeObjectEnd();');
		}

		javaOutput.push('	}');
		javaOutput.push('');
	}

	// Add helper methods
	javaOutput.push('	private void writeColor(JsonWriter json, Color color) throws IOException {');
	javaOutput.push('		if (color == null) {');
	javaOutput.push('			json.writeNull();');
	javaOutput.push('		} else {');
	javaOutput.push('			json.writeObjectStart();');
	javaOutput.push('			json.writeName("r");');
	javaOutput.push('			json.writeValue(color.r);');
	javaOutput.push('			json.writeName("g");');
	javaOutput.push('			json.writeValue(color.g);');
	javaOutput.push('			json.writeName("b");');
	javaOutput.push('			json.writeValue(color.b);');
	javaOutput.push('			json.writeName("a");');
	javaOutput.push('			json.writeValue(color.a);');
	javaOutput.push('			json.writeObjectEnd();');
	javaOutput.push('		}');
	javaOutput.push('	}');
	javaOutput.push('');

	javaOutput.push('	private void writeTextureRegion(JsonWriter json, TextureRegion region) throws IOException {');
	javaOutput.push('		if (region == null) {');
	javaOutput.push('			json.writeNull();');
	javaOutput.push('		} else {');
	javaOutput.push('			json.writeObjectStart();');
	javaOutput.push('			json.writeName("u");');
	javaOutput.push('			json.writeValue(region.getU());');
	javaOutput.push('			json.writeName("v");');
	javaOutput.push('			json.writeValue(region.getV());');
	javaOutput.push('			json.writeName("u2");');
	javaOutput.push('			json.writeValue(region.getU2());');
	javaOutput.push('			json.writeName("v2");');
	javaOutput.push('			json.writeValue(region.getV2());');
	javaOutput.push('			json.writeName("width");');
	javaOutput.push('			json.writeValue(region.getRegionWidth());');
	javaOutput.push('			json.writeName("height");');
	javaOutput.push('			json.writeValue(region.getRegionHeight());');
	javaOutput.push('			json.writeObjectEnd();');
	javaOutput.push('		}');
	javaOutput.push('	}');
	javaOutput.push('');

	// Add JsonWriter inner class
	javaOutput.push('	private static class JsonWriter {');
	javaOutput.push('		private final Writer writer;');
	javaOutput.push('		private int depth = 0;');
	javaOutput.push('		private boolean needsComma = false;');
	javaOutput.push('');
	javaOutput.push('		JsonWriter(Writer writer) {');
	javaOutput.push('			this.writer = writer;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeObjectStart() throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write("{");');
	javaOutput.push('			depth++;');
	javaOutput.push('			needsComma = false;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeObjectEnd() throws IOException {');
	javaOutput.push('			depth--;');
	javaOutput.push('			if (needsComma) {');
	javaOutput.push('				writer.write("\\n");');
	javaOutput.push('				writeIndent();');
	javaOutput.push('			}');
	javaOutput.push('			writer.write("}");');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeArrayStart() throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write("[");');
	javaOutput.push('			depth++;');
	javaOutput.push('			needsComma = false;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeArrayEnd() throws IOException {');
	javaOutput.push('			depth--;');
	javaOutput.push('			if (needsComma) {');
	javaOutput.push('				writer.write("\\n");');
	javaOutput.push('				writeIndent();');
	javaOutput.push('			}');
	javaOutput.push('			writer.write("]");');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeName(String name) throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write("\\n");');
	javaOutput.push('			writeIndent();');
	javaOutput.push('			writer.write("\\"" + name + "\\": ");');
	javaOutput.push('			needsComma = false;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeValue(String value) throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			if (value == null) {');
	javaOutput.push('				writer.write("null");');
	javaOutput.push('			} else {');
	javaOutput.push('				writer.write("\\"" + escapeString(value) + "\\"");');
	javaOutput.push('			}');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeValue(float value) throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write(String.format(Locale.US, "%.6f", value).replaceAll("0+$", "").replaceAll("\\\\.$", ""));');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeValue(int value) throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write(String.valueOf(value));');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeValue(boolean value) throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write(String.valueOf(value));');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void writeNull() throws IOException {');
	javaOutput.push('			writeCommaIfNeeded();');
	javaOutput.push('			writer.write("null");');
	javaOutput.push('			needsComma = true;');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		void close() throws IOException {');
	javaOutput.push('			writer.write("\\n");');
	javaOutput.push('			writer.flush();');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		private void writeCommaIfNeeded() throws IOException {');
	javaOutput.push('			if (needsComma) {');
	javaOutput.push('				writer.write(",");');
	javaOutput.push('			}');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		private void writeIndent() throws IOException {');
	javaOutput.push('			for (int i = 0; i < depth; i++) {');
	javaOutput.push('				writer.write("  ");');
	javaOutput.push('			}');
	javaOutput.push('		}');
	javaOutput.push('');
	javaOutput.push('		private String escapeString(String str) {');
	javaOutput.push('			return str.replace("\\\\", "\\\\\\\\")');
	javaOutput.push('				.replace("\\"", "\\\\\\"")');
	javaOutput.push('				.replace("\\b", "\\\\b")');
	javaOutput.push('				.replace("\\f", "\\\\f")');
	javaOutput.push('				.replace("\\n", "\\\\n")');
	javaOutput.push('				.replace("\\r", "\\\\r")');
	javaOutput.push('				.replace("\\t", "\\\\t");');
	javaOutput.push('		}');
	javaOutput.push('	}');
	javaOutput.push('}');

	return javaOutput.join('\n');
}

async function main() {
	try {
		// Read analysis result
		const analysisFile = path.join(process.cwd(), 'output', 'analysis-result.json');
		if (!fs.existsSync(analysisFile)) {
			console.error('Analysis result not found. Run analyze-java-api.ts first.');
			process.exit(1);
		}

		const analysisData: SerializedAnalysisResult = JSON.parse(fs.readFileSync(analysisFile, 'utf8'));

		// Generate Java serializer
		const javaCode = generateJavaSerializer(analysisData);

		// Write the Java file
		const javaFile = path.join(
			path.dirname(process.cwd()),
			'spine-libgdx',
			'spine-libgdx',
			'src',
			'com',
			'esotericsoftware',
			'spine',
			'utils',
			'SkeletonSerializer.java'
		);

		fs.mkdirSync(path.dirname(javaFile), { recursive: true });
		fs.writeFileSync(javaFile, javaCode);

		console.log(`Generated Java serializer: ${javaFile}`);
	} catch (error: any) {
		console.error('Error:', error.message);
		process.exit(1);
	}
}

// Allow running as a script or importing the function
if (import.meta.url === `file://${process.argv[1]}`) {
	main();
}

export { generateJavaSerializer };
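The generator above relies on two small string transforms: turning a getter method name into a JSON property name, and the float formatting the emitted Java `writeValue(float)` performs. The following standalone TypeScript sketch (not part of the commit; function names are illustrative) mirrors both so the intended behavior is easy to check:

```typescript
// Getter method name -> JSON property name, e.g. "getScaleX()" -> "scaleX".
// Mirrors prop.name.replace('get', '').replace('()', '') + lowercased first char.
function getterToPropertyName(getter: string): string {
	const stripped = getter.replace('get', '').replace('()', '');
	return stripped.charAt(0).toLowerCase() + stripped.slice(1);
}

// Equivalent of the generated Java: String.format(Locale.US, "%.6f", v)
// with trailing zeros and a dangling "." stripped.
function formatFloat(value: number): string {
	return value.toFixed(6).replace(/0+$/, '').replace(/\.$/, '');
}

console.log(getterToPropertyName('getScaleX()')); // "scaleX"
console.log(formatFloat(0.25)); // "0.25"
console.log(formatFloat(2)); // "2"
```

This makes the diff-friendliness of the output concrete: floats render identically across runs, and property names match Java bean conventions.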
189
tests/package-lock.json
generated
@@ -5,6 +5,9 @@
  "packages": {
    "": {
      "name": "spine-tests",
      "dependencies": {
        "@mariozechner/lsp-cli": "^0.1.1"
      },
      "devDependencies": {
        "@types/node": "^20.0.0",
        "tsx": "^4.0.0"
@@ -452,6 +455,22 @@
        "node": ">=18"
      }
    },
    "node_modules/@mariozechner/lsp-cli": {
      "version": "0.1.1",
      "resolved": "https://registry.npmjs.org/@mariozechner/lsp-cli/-/lsp-cli-0.1.1.tgz",
      "integrity": "sha512-/5HF/PoYhQKFMhXLQiH1ZHTBJfs7rB8xcXa4YXSq6aTR5G6tWOrE6fwFbK7pwtkry6ZFeCTE2HI4BRWv5La9Qg==",
      "dependencies": {
        "chalk": "^5.4.1",
        "commander": "^11.0.0",
        "node-stream-zip": "^1.15.0",
        "tar": "^6.2.0",
        "vscode-jsonrpc": "^8.2.0",
        "vscode-languageserver-protocol": "^3.17.0"
      },
      "bin": {
        "lsp-cli": "dist/index.js"
      }
    },
    "node_modules/@types/node": {
      "version": "20.19.7",
      "resolved": "https://registry.npmjs.org/@types/node/-/node-20.19.7.tgz",
@@ -462,6 +481,36 @@
        "undici-types": "~6.21.0"
      }
    },
    "node_modules/chalk": {
      "version": "5.4.1",
      "resolved": "https://registry.npmjs.org/chalk/-/chalk-5.4.1.tgz",
      "integrity": "sha512-zgVZuo2WcZgfUEmsn6eO3kINexW8RAE4maiQ8QNs8CtpPCSyMiYsULR3HQYkm3w8FIA3SberyMJMSldGsW+U3w==",
      "license": "MIT",
      "engines": {
        "node": "^12.17.0 || ^14.13 || >=16.0.0"
      },
      "funding": {
        "url": "https://github.com/chalk/chalk?sponsor=1"
      }
    },
    "node_modules/chownr": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/chownr/-/chownr-2.0.0.tgz",
      "integrity": "sha512-bIomtDF5KGpdogkLd9VspvFzk9KfpyyGlS8YFVZl7TGPBHL5snIOnxeshwVgPteQ9b4Eydl+pVbIyE1DcvCWgQ==",
      "license": "ISC",
      "engines": {
        "node": ">=10"
      }
    },
    "node_modules/commander": {
      "version": "11.1.0",
      "resolved": "https://registry.npmjs.org/commander/-/commander-11.1.0.tgz",
      "integrity": "sha512-yPVavfyCcRhmorC7rWlkHn15b4wDVgVmBA7kV4QVBsF7kv/9TKJAbAXVTxvTnwP8HHKjRCJDClKbciiYS7p0DQ==",
      "license": "MIT",
      "engines": {
        "node": ">=16"
      }
    },
    "node_modules/esbuild": {
      "version": "0.25.6",
      "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.6.tgz",
@@ -504,6 +553,30 @@
        "@esbuild/win32-x64": "0.25.6"
      }
    },
    "node_modules/fs-minipass": {
      "version": "2.1.0",
      "resolved": "https://registry.npmjs.org/fs-minipass/-/fs-minipass-2.1.0.tgz",
      "integrity": "sha512-V/JgOLFCS+R6Vcq0slCuaeWEdNC3ouDlJMNIsacH2VtALiu9mV4LPrHc5cDl8k5aw6J8jwgWWpiTo5RYhmIzvg==",
      "license": "ISC",
      "dependencies": {
        "minipass": "^3.0.0"
      },
      "engines": {
        "node": ">= 8"
      }
    },
    "node_modules/fs-minipass/node_modules/minipass": {
      "version": "3.3.6",
      "resolved": "https://registry.npmjs.org/minipass/-/minipass-3.3.6.tgz",
      "integrity": "sha512-DxiNidxSEK+tHG6zOIklvNOwm3hvCrbUrdtzY74U6HKTJxvIDfOUL5W5P2Ghd3DTkhhKPYGqeNUIh5qcM4YBfw==",
      "license": "ISC",
      "dependencies": {
        "yallist": "^4.0.0"
      },
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/fsevents": {
      "version": "2.3.3",
      "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
@@ -532,6 +605,65 @@
        "url": "https://github.com/privatenumber/get-tsconfig?sponsor=1"
      }
    },
    "node_modules/minipass": {
      "version": "5.0.0",
      "resolved": "https://registry.npmjs.org/minipass/-/minipass-5.0.0.tgz",
      "integrity": "sha512-3FnjYuehv9k6ovOEbyOswadCDPX1piCfhV8ncmYtHOjuPwylVWsghTLo7rabjC3Rx5xD4HDx8Wm1xnMF7S5qFQ==",
      "license": "ISC",
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/minizlib": {
      "version": "2.1.2",
      "resolved": "https://registry.npmjs.org/minizlib/-/minizlib-2.1.2.tgz",
      "integrity": "sha512-bAxsR8BVfj60DWXHE3u30oHzfl4G7khkSuPW+qvpd7jFRHm7dLxOjUk1EHACJ/hxLY8phGJ0YhYHZo7jil7Qdg==",
      "license": "MIT",
      "dependencies": {
        "minipass": "^3.0.0",
        "yallist": "^4.0.0"
      },
      "engines": {
        "node": ">= 8"
      }
    },
    "node_modules/minizlib/node_modules/minipass": {
      "version": "3.3.6",
      "resolved": "https://registry.npmjs.org/minipass/-/minipass-3.3.6.tgz",
      "integrity": "sha512-DxiNidxSEK+tHG6zOIklvNOwm3hvCrbUrdtzY74U6HKTJxvIDfOUL5W5P2Ghd3DTkhhKPYGqeNUIh5qcM4YBfw==",
      "license": "ISC",
      "dependencies": {
        "yallist": "^4.0.0"
      },
      "engines": {
        "node": ">=8"
      }
    },
    "node_modules/mkdirp": {
      "version": "1.0.4",
      "resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-1.0.4.tgz",
      "integrity": "sha512-vVqVZQyf3WLx2Shd0qJ9xuvqgAyKPLAiqITEtqW0oIUjzo3PePDd6fW9iFz30ef7Ysp/oiWqbhszeGWW2T6Gzw==",
      "license": "MIT",
      "bin": {
        "mkdirp": "bin/cmd.js"
      },
      "engines": {
        "node": ">=10"
      }
    },
    "node_modules/node-stream-zip": {
      "version": "1.15.0",
      "resolved": "https://registry.npmjs.org/node-stream-zip/-/node-stream-zip-1.15.0.tgz",
      "integrity": "sha512-LN4fydt9TqhZhThkZIVQnF9cwjU3qmUH9h78Mx/K7d3VvfRqqwthLwJEUOEL0QPZ0XQmNN7be5Ggit5+4dq3Bw==",
      "license": "MIT",
      "engines": {
        "node": ">=0.12.0"
      },
      "funding": {
        "type": "github",
        "url": "https://github.com/sponsors/antelle"
      }
    },
    "node_modules/resolve-pkg-maps": {
      "version": "1.0.0",
      "resolved": "https://registry.npmjs.org/resolve-pkg-maps/-/resolve-pkg-maps-1.0.0.tgz",
@@ -542,6 +674,23 @@
        "url": "https://github.com/privatenumber/resolve-pkg-maps?sponsor=1"
      }
    },
    "node_modules/tar": {
      "version": "6.2.1",
      "resolved": "https://registry.npmjs.org/tar/-/tar-6.2.1.tgz",
      "integrity": "sha512-DZ4yORTwrbTj/7MZYq2w+/ZFdI6OZ/f9SFHR+71gIVUZhOQPHzVCLpvRnPgyaMpfWxxk/4ONva3GQSyNIKRv6A==",
      "license": "ISC",
      "dependencies": {
        "chownr": "^2.0.0",
        "fs-minipass": "^2.0.0",
        "minipass": "^5.0.0",
        "minizlib": "^2.1.1",
        "mkdirp": "^1.0.3",
        "yallist": "^4.0.0"
      },
      "engines": {
        "node": ">=10"
      }
    },
    "node_modules/tsx": {
      "version": "4.20.3",
      "resolved": "https://registry.npmjs.org/tsx/-/tsx-4.20.3.tgz",
@@ -568,6 +717,46 @@
      "integrity": "sha512-iwDZqg0QAGrg9Rav5H4n0M64c3mkR59cJ6wQp+7C4nI0gsmExaedaYLNO44eT4AtBBwjbTiGPMlt2Md0T9H9JQ==",
      "dev": true,
      "license": "MIT"
    },
    "node_modules/vscode-jsonrpc": {
      "version": "8.2.1",
      "resolved": "https://registry.npmjs.org/vscode-jsonrpc/-/vscode-jsonrpc-8.2.1.tgz",
      "integrity": "sha512-kdjOSJ2lLIn7r1rtrMbbNCHjyMPfRnowdKjBQ+mGq6NAW5QY2bEZC/khaC5OR8svbbjvLEaIXkOq45e2X9BIbQ==",
      "license": "MIT",
      "engines": {
        "node": ">=14.0.0"
      }
    },
    "node_modules/vscode-languageserver-protocol": {
      "version": "3.17.5",
      "resolved": "https://registry.npmjs.org/vscode-languageserver-protocol/-/vscode-languageserver-protocol-3.17.5.tgz",
      "integrity": "sha512-mb1bvRJN8SVznADSGWM9u/b07H7Ecg0I3OgXDuLdn307rl/J3A9YD6/eYOssqhecL27hK1IPZAsaqh00i/Jljg==",
      "license": "MIT",
      "dependencies": {
        "vscode-jsonrpc": "8.2.0",
        "vscode-languageserver-types": "3.17.5"
      }
    },
    "node_modules/vscode-languageserver-protocol/node_modules/vscode-jsonrpc": {
      "version": "8.2.0",
      "resolved": "https://registry.npmjs.org/vscode-jsonrpc/-/vscode-jsonrpc-8.2.0.tgz",
      "integrity": "sha512-C+r0eKJUIfiDIfwJhria30+TYWPtuHJXHtI7J0YlOmKAo7ogxP20T0zxB7HZQIFhIyvoBPwWskjxrvAtfjyZfA==",
      "license": "MIT",
      "engines": {
        "node": ">=14.0.0"
      }
    },
    "node_modules/vscode-languageserver-types": {
      "version": "3.17.5",
      "resolved": "https://registry.npmjs.org/vscode-languageserver-types/-/vscode-languageserver-types-3.17.5.tgz",
      "integrity": "sha512-Ld1VelNuX9pdF39h2Hgaeb5hEZM2Z3jUrrMgWQAu82jMtZp7p3vJT3BzToKtZI7NgQssZje5o0zryOrhQvzQAg==",
      "license": "MIT"
    },
    "node_modules/yallist": {
      "version": "4.0.0",
      "resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
      "integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
      "license": "ISC"
    }
  }
}

@@ -8,5 +8,8 @@
  "devDependencies": {
    "@types/node": "^20.0.0",
    "tsx": "^4.0.0"
  },
  "dependencies": {
    "@mariozechner/lsp-cli": "^0.1.1"
  }
}

68
tests/types.ts
Normal file
@@ -0,0 +1,68 @@
// Shared types for the Spine serializer generator

// Match lsp-cli's Supertype interface
export interface Supertype {
	name: string;
	typeArguments?: string[];
}

// Match lsp-cli's SymbolInfo interface (we call it Symbol for backward compatibility)
export interface Symbol {
	name: string;
	kind: string;
	file: string;
	preview: string;
	documentation?: string;
	typeParameters?: string[];
	supertypes?: Supertype[];
	children?: Symbol[];
	// We don't need range and definition for our use case
}

export interface LspOutput {
	language: string;
	directory: string;
	symbols: Symbol[];
}

export interface ClassInfo {
	className: string;
	superTypes: string[]; // Just the names for backward compatibility
	superTypeDetails?: Supertype[]; // Full details with type arguments
	getters: GetterInfo[];
	fields: FieldInfo[];
	file: string;
	isAbstract: boolean;
	isInterface: boolean;
	isEnum: boolean;
	typeParameters?: string[]; // The class's own type parameters
	enumValues?: string[]; // For enums
	concreteImplementations?: string[]; // For abstract classes/interfaces - only leaf concrete types
	allImplementations?: string[]; // For abstract classes/interfaces - includes intermediate abstract types
}

export interface GetterInfo {
	methodName: string;
	returnType: string;
}

export interface FieldInfo {
	fieldName: string;
	fieldType: string;
	isFinal: boolean;
}

export interface PropertyInfo {
	name: string;
	type: string;
	isGetter: boolean;
	inheritedFrom?: string; // Which class this property was inherited from
}

export interface AnalysisResult {
	classMap: Map<string, ClassInfo>;
	accessibleTypes: Set<string>;
	abstractTypes: Map<string, string[]>; // abstract type -> concrete implementations
	allTypesToGenerate: Set<string>; // all types that need write methods
	typeProperties: Map<string, PropertyInfo[]>; // type -> all properties (including inherited)
}
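To show how the interfaces in types.ts compose, here is a small standalone sketch (hypothetical data, with the relevant interfaces re-declared inline so the snippet stands alone) that builds a minimal `LspOutput` and filters it the way an analyzer pass might when collecting classes:

```typescript
// Minimal re-declarations of the shapes defined in types.ts.
interface Supertype { name: string; typeArguments?: string[]; }
interface SymbolInfo {
	name: string; kind: string; file: string; preview: string;
	supertypes?: Supertype[]; children?: SymbolInfo[];
}
interface LspOutput { language: string; directory: string; symbols: SymbolInfo[]; }

// Hypothetical lsp-cli output: one class and one enum symbol.
const output: LspOutput = {
	language: 'java',
	directory: 'spine-libgdx',
	symbols: [
		{ name: 'SkeletonData', kind: 'class', file: 'SkeletonData.java', preview: 'public class SkeletonData' },
		{ name: 'Inherit', kind: 'enum', file: 'BoneData.java', preview: 'public enum Inherit' },
	],
};

// Keep only class symbols, as a classMap-building pass would.
const classNames = output.symbols.filter(s => s.kind === 'class').map(s => s.name);
console.log(classNames); // ['SkeletonData']
```

The actual analyzer also walks `children` and `supertypes` to resolve inherited properties; this sketch only illustrates the data shapes.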